NOTEBOOK Description 📗
This notebook contains the components listed below.
NOTEBOOK Contents ✍️
Models Set - 1
!pip install tensorflow-addons
Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/
Collecting tensorflow-addons
Downloading tensorflow_addons-0.18.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.1 MB)
|████████████████████████████████| 1.1 MB 26.3 MB/s
Requirement already satisfied: typeguard>=2.7 in /usr/local/lib/python3.7/dist-packages (from tensorflow-addons) (2.7.1)
Requirement already satisfied: packaging in /usr/local/lib/python3.7/dist-packages (from tensorflow-addons) (21.3)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /usr/local/lib/python3.7/dist-packages (from packaging->tensorflow-addons) (3.0.9)
Installing collected packages: tensorflow-addons
Successfully installed tensorflow-addons-0.18.0
## TRAIN set files
!gdown 1wLqG-v6Gzhx4Ap81WBQadoImo0-EcImB
Downloading...
From: https://drive.google.com/uc?id=1wLqG-v6Gzhx4Ap81WBQadoImo0-EcImB
To: /content/Dataset.zip
100% 817M/817M [00:03<00:00, 254MB/s]
!unzip Dataset.zip
import os
import sys
import math
import scipy
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns
import cv2
import plotly.express as px
import plotly.graph_objects as go
import hashlib
from IPython.display import display
from sklearn.utils import class_weight
from PIL import Image
from tqdm import tqdm
tqdm.pandas()
%matplotlib inline
pd.set_option('display.max_columns', 30)
label_font_dict = {'family':'sans-serif','size':13.5,'color':'brown','style':'italic'}
ticks_font_dict = {'family':'sans-serif','size':11,'color':'black','style':'italic'}
title_font_dict = {'family':'sans-serif','size':17.5,'color':'Blue','style':'italic'}
all_images_loc = os.path.join(os.getcwd(), "Dataset", "Pathology_2020", "Images")
all_images_loc
'/content/Dataset/Pathology_2020/Images'
train_test_csv_loc = os.path.join(os.getcwd(), "Dataset", "Pathology_2020")
train_test_csv_loc
'/content/Dataset/Pathology_2020'
# plant pathology 2020
pp_2020_train_csv_file = "train.csv"
pp_2020_test_csv_file = "test.csv"
## train images csv file name
train_images_info_csv = os.path.join(train_test_csv_loc, pp_2020_train_csv_file)
print("### Train images csv file is --> {} \n".format(train_images_info_csv))
## train images details
train_images_info_df = pd.read_csv(train_images_info_csv)
train_images_info_df
### Train images csv file is --> /content/Dataset/Pathology_2020/train.csv
| | image_id | healthy | multiple_diseases | rust | scab |
|---|---|---|---|---|---|
| 0 | Train_0 | 0 | 0 | 0 | 1 |
| 1 | Train_1 | 0 | 1 | 0 | 0 |
| 2 | Train_2 | 1 | 0 | 0 | 0 |
| 3 | Train_3 | 0 | 0 | 1 | 0 |
| 4 | Train_4 | 1 | 0 | 0 | 0 |
| ... | ... | ... | ... | ... | ... |
| 1816 | Train_1816 | 0 | 0 | 0 | 1 |
| 1817 | Train_1817 | 1 | 0 | 0 | 0 |
| 1818 | Train_1818 | 1 | 0 | 0 | 0 |
| 1819 | Train_1819 | 0 | 0 | 1 | 0 |
| 1820 | Train_1820 | 0 | 0 | 0 | 1 |
1821 rows × 5 columns
## test images csv file name
test_images_info_csv = os.path.join(train_test_csv_loc, pp_2020_test_csv_file)
print("### Test images csv file is --> {} \n".format(test_images_info_csv))
## test images details
test_images_info_df = pd.read_csv(test_images_info_csv)
test_images_info_df
### Test images csv file is --> /content/Dataset/Pathology_2020/test.csv
| | image_id |
|---|---|
| 0 | Test_0 |
| 1 | Test_1 |
| 2 | Test_2 |
| 3 | Test_3 |
| 4 | Test_4 |
| ... | ... |
| 1816 | Test_1816 |
| 1817 | Test_1817 |
| 1818 | Test_1818 |
| 1819 | Test_1819 |
| 1820 | Test_1820 |
1821 rows × 1 columns
# checking data types of the columns
test_images_info_df.dtypes
image_id    object
dtype: object
Data_Preparation
import random as rn
import datetime
import time
import pathlib
import tensorflow as tf
import tensorflow_addons as tfa
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Dense, Input, Dropout, BatchNormalization, Activation, Flatten, Conv1D, Conv2D, Concatenate, Lambda
from tensorflow.keras.optimizers import Adam, RMSprop
from tensorflow.keras.utils import plot_model
from tensorflow.keras.layers import MaxPool1D, MaxPool2D, GlobalAveragePooling1D, GlobalAveragePooling2D, AveragePooling1D, AveragePooling2D, SpatialDropout2D
from keras.regularizers import l1, l2
from keras.preprocessing import image
from keras.callbacks import Callback
from keras.callbacks import TensorBoard
from sklearn.metrics import accuracy_score, recall_score, f1_score, precision_score, roc_auc_score, confusion_matrix
from tensorboard import notebook
from sklearn.model_selection import train_test_split
all_images_loc
'/content/Dataset/Pathology_2020/Images'
train_images_info_df
| | image_id | healthy | multiple_diseases | rust | scab |
|---|---|---|---|---|---|
| 0 | Train_0 | 0 | 0 | 0 | 1 |
| 1 | Train_1 | 0 | 1 | 0 | 0 |
| 2 | Train_2 | 1 | 0 | 0 | 0 |
| 3 | Train_3 | 0 | 0 | 1 | 0 |
| 4 | Train_4 | 1 | 0 | 0 | 0 |
| ... | ... | ... | ... | ... | ... |
| 1816 | Train_1816 | 0 | 0 | 0 | 1 |
| 1817 | Train_1817 | 1 | 0 | 0 | 0 |
| 1818 | Train_1818 | 1 | 0 | 0 | 0 |
| 1819 | Train_1819 | 0 | 0 | 1 | 0 |
| 1820 | Train_1820 | 0 | 0 | 0 | 1 |
1821 rows × 5 columns
# Storing the images paths and respective labels
def images_paths(img_name, global_path=all_images_loc):
    """
    Description : This function generates the path of every given dataset image.
    Inputs      : It accepts the below 2 parameters:
                  - img_name    : Image name
                  - global_path : Parent folder path
    Returns     : It returns the complete path of every given image.
    """
    return os.path.join(global_path, img_name + '.jpg')
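For readers running this outside Colab, a tiny standalone sketch of the same helper may be useful; the hard-coded root below is just the path printed above, so swap in your own dataset location:

```python
import os

# Standalone re-statement of the helper above, with the dataset root
# hard-coded to the Colab path printed earlier in this notebook.
def images_paths(img_name, global_path="/content/Dataset/Pathology_2020/Images"):
    # Append the .jpg extension to the image id and join with the parent folder
    return os.path.join(global_path, img_name + ".jpg")

print(images_paths("Train_0"))
# /content/Dataset/Pathology_2020/Images/Train_0.jpg
```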
# All train images paths
all_train_paths = train_images_info_df['image_id'].apply(images_paths).values
# All test images paths
all_test_paths = test_images_info_df['image_id'].apply(images_paths).values
# All train images tgt labels
all_train_labels = np.float32(train_images_info_df.loc[:, 'healthy':'scab'].values)
# TRAIN and TEST Split
train_paths, test_paths, train_labels, test_labels = train_test_split(all_train_paths,
all_train_labels,
test_size=0.20,
random_state=2022,
stratify=all_train_labels)
# Checking shapes of all images paths and labels
all_train_paths.shape, all_train_labels.shape, all_test_paths.shape
((1821,), (1821, 4), (1821,))
# Checking shapes of all TRAIN & TEST images paths and labels
train_paths.shape, train_labels.shape, test_paths.shape, test_labels.shape
((1456,), (1456, 4), (365,), (365, 4))
# Defining the tgt classes
cols = ['Healthy', 'Multiple_Diseases', 'Rust', 'Scab']
# Getting image names of the TEST set
test_imgs_names = []
for path in test_paths:
    test_imgs_names.append(os.path.basename(path))
# Creating DataFrame of images names and tgt labels
test_imgs_names_df = pd.DataFrame(test_imgs_names, columns=['TEST_Imgs_Name'])
test_imgs_lbls_df = pd.DataFrame(test_labels, columns=cols)
# Merging the above dataframes
test_imgs_names_lbls_df = pd.concat([test_imgs_names_df, test_imgs_lbls_df], axis=1)
test_imgs_names_lbls_df
| | TEST_Imgs_Name | Healthy | Multiple_Diseases | Rust | Scab |
|---|---|---|---|---|---|
| 0 | Train_1196.jpg | 0.0 | 0.0 | 1.0 | 0.0 |
| 1 | Train_778.jpg | 0.0 | 0.0 | 1.0 | 0.0 |
| 2 | Train_1119.jpg | 0.0 | 1.0 | 0.0 | 0.0 |
| 3 | Train_624.jpg | 0.0 | 0.0 | 0.0 | 1.0 |
| 4 | Train_770.jpg | 0.0 | 0.0 | 0.0 | 1.0 |
| ... | ... | ... | ... | ... | ... |
| 360 | Train_113.jpg | 0.0 | 1.0 | 0.0 | 0.0 |
| 361 | Train_1573.jpg | 0.0 | 1.0 | 0.0 | 0.0 |
| 362 | Train_1100.jpg | 0.0 | 0.0 | 0.0 | 1.0 |
| 363 | Train_452.jpg | 1.0 | 0.0 | 0.0 | 0.0 |
| 364 | Train_754.jpg | 0.0 | 1.0 | 0.0 | 0.0 |
365 rows × 5 columns
train_labels_df = pd.DataFrame(train_labels, columns=cols)
test_labels_df = pd.DataFrame(test_labels, columns=cols)
# Images representing HEALTHY in TRAIN
tr_hl_lbls_counts = pd.DataFrame(train_labels_df['Healthy'].value_counts())
tr_hl_lbls_counts
| | Healthy |
|---|---|
| 0.0 | 1043 |
| 1.0 | 413 |
# Images representing MULTIPLE_DISEASES in TRAIN
tr_md_lbls_counts = pd.DataFrame(train_labels_df['Multiple_Diseases'].value_counts())
tr_md_lbls_counts
| | Multiple_Diseases |
|---|---|
| 0.0 | 1383 |
| 1.0 | 73 |
# Images representing RUST in TRAIN
tr_rusty_lbls_counts = pd.DataFrame(train_labels_df['Rust'].value_counts())
tr_rusty_lbls_counts
| | Rust |
|---|---|
| 0.0 | 959 |
| 1.0 | 497 |
# Images representing SCAB in TRAIN
tr_scaby_lbls_counts = pd.DataFrame(train_labels_df['Scab'].value_counts())
tr_scaby_lbls_counts
| | Scab |
|---|---|
| 0.0 | 983 |
| 1.0 | 473 |
# TRAIN TGT labels distribution
train_tgt_classes_dist = pd.concat([tr_hl_lbls_counts, tr_md_lbls_counts, tr_rusty_lbls_counts, tr_scaby_lbls_counts], axis=1)
train_tgt_classes_dist
| | Healthy | Multiple_Diseases | Rust | Scab |
|---|---|---|---|---|
| 0.0 | 1043 | 1383 | 959 | 983 |
| 1.0 | 413 | 73 | 497 | 473 |
OBSERVATION
The Multiple_Diseases class has far fewer samples than the other classes.
# Images representing HEALTHY in TEST set
test_hl_lbls_counts = pd.DataFrame(test_labels_df['Healthy'].value_counts())
test_hl_lbls_counts
| | Healthy |
|---|---|
| 0.0 | 262 |
| 1.0 | 103 |
# Images representing MULTIPLE_DISEASES in TEST set
test_md_lbls_counts = pd.DataFrame(test_labels_df['Multiple_Diseases'].value_counts())
test_md_lbls_counts
| | Multiple_Diseases |
|---|---|
| 0.0 | 347 |
| 1.0 | 18 |
# Images representing RUST in TEST set
test_rusty_lbls_counts = pd.DataFrame(test_labels_df['Rust'].value_counts())
test_rusty_lbls_counts
| | Rust |
|---|---|
| 0.0 | 240 |
| 1.0 | 125 |
# Images representing SCAB in TEST set
test_scaby_lbls_counts = pd.DataFrame(test_labels_df['Scab'].value_counts())
test_scaby_lbls_counts
| | Scab |
|---|---|
| 0.0 | 246 |
| 1.0 | 119 |
# TEST set TGT labels distribution
test_tgt_classes_dist = pd.concat([test_hl_lbls_counts, test_md_lbls_counts, test_rusty_lbls_counts, test_scaby_lbls_counts], axis=1)
test_tgt_classes_dist
| | Healthy | Multiple_Diseases | Rust | Scab |
|---|---|---|---|---|
| 0.0 | 262 | 347 | 240 | 246 |
| 1.0 | 103 | 18 | 125 | 119 |
OBSERVATION
The Multiple_Diseases class has far fewer samples (only 18) than the other classes.
# Function to read all the images
def reading_images(imgs_paths, size=(224,224)):
    """
    Description : This function will read all the given images and store them in a list.
    """
    # List container for storing all the images after reading, resizing & normalization
    images = []
    for index, img_path in enumerate(imgs_paths):
        # Reading the image
        img = cv2.imread(img_path)
        # Converting the color space from BGR to RGB
        img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)
        # Resizing the image
        img = cv2.resize(img, size)
        # Normalizing the pixel values
        img = img / 255.0
        # Storing every image
        images.append(img)
    return images
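One caveat about `reading_images`: dividing a `uint8` image by 255.0 promotes it to `float64`, so the 1456 train images at 224x224x3 occupy roughly 1.7 GB in memory. A small NumPy-only sketch (a zero array stands in for a decoded `cv2.imread` frame) shows that an explicit cast to `float32` halves the footprint:

```python
import numpy as np

# Stand-in for a decoded 224x224 RGB frame from cv2.imread
img = np.zeros((224, 224, 3), dtype=np.uint8)

f64 = img / 255.0                       # NumPy promotes uint8 / float to float64
f32 = (img / 255.0).astype(np.float32)  # explicit down-cast halves the memory

print(f64.dtype, f64.nbytes)  # float64 1204224
print(f32.dtype, f32.nbytes)  # float32 602112
```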
# Reading all the TRAIN images in a single list
all_train_images = reading_images(imgs_paths=train_paths)
# Matching the count of existing TRAIN images before-after reading them
print("### Total TRAIN Images & Labels available after splitting are ===> {} and {} ###".format(len(all_train_images),
train_labels.shape[0]))
### Total TRAIN Images & Labels available after splitting are ===> 1456 and 1456 ###
# Checking the shape of train image after reading it
print("### The shape of train image after reading it is ===> {} ###".format(all_train_images[0].shape))
### The shape of train image after reading it is ===> (224, 224, 3) ###
# Reading all the TEST images in a single list
all_test_images = reading_images(imgs_paths=test_paths)
# Matching the count of existing TEST images before-after reading them
print("### Total TEST Images & Labels available after splitting are ===> {} and {} ###".format(len(all_test_images),
test_labels.shape[0]))
### Total TEST Images & Labels available after splitting are ===> 365 and 365 ###
# Checking the shape of test image after reading it
print("### The shape of test image after reading it is ===> {} ###".format(all_test_images[0].shape))
### The shape of test image after reading it is ===> (224, 224, 3) ###
# Storing X_test and y_test for evaluation of models
X_test = np.asarray(all_test_images)
y_test = test_labels
# Checking the dtype & shape of test set after reading it
print("### The dtype & shape of test set after reading it is ===> {}, {}, {}, {} ###".format(type(X_test), type(y_test), X_test.shape, y_test.shape))
### The dtype & shape of test set after reading it is ===> <class 'numpy.ndarray'>, <class 'numpy.ndarray'>, (365, 224, 224, 3), (365, 4) ###
test_labels.shape
(365, 4)
# Defining the directory for storing the augmented images
aug_imgs_path = pathlib.Path(os.path.join(os.getcwd(), "Dataset", "Pathology_2020", "Augmented_Images"))
aug_imgs_path
PosixPath('/content/Dataset/Pathology_2020/Augmented_Images')
# Creating the folder for storing the Augmented Images (no-op if it already exists)
os.makedirs(aug_imgs_path, exist_ok=True)
def data_augmentation(image):
    """
    Description : This function performs augmentations on an image to yield a different image.
    Inputs      : It accepts the below parameter:
                  - image : Image tensor/array
    Returns     : It returns the tensor object of an augmented image.
    """
    # Adjusting the hue of the RGB image
    image = tf.image.adjust_hue(image, delta=0.02)
    # Adjusting the brightness of the RGB image
    image = tf.image.adjust_brightness(image, delta=0.05)
    # Reducing the noise using a Gaussian filter
    image = tfa.image.gaussian_filter2d(image, filter_shape=(16,16))
    return image
# Defining the IMAGE DATA GENERATOR for only the TRAIN SET
train_img_gen = tf.keras.preprocessing.image.ImageDataGenerator(rotation_range=0.3,
zoom_range=0.2,
horizontal_flip=True,
vertical_flip=True,
shear_range=0.25,
width_shift_range=0.15,
height_shift_range=0.15,
preprocessing_function=data_augmentation,
rescale=None)
Image_Augmentation
def generate_aug_images(input_images, labels, data_gen, aug_img_dir, size=(224,224), no_of_aug_images=9):
    """
    Description : This function generates and stores (both on disk and in memory) the augmented images.
    """
    images_after_aug = []
    labels_after_aug = []
    for index, image in enumerate(input_images):
        for i in range(no_of_aug_images):
            img = data_gen.flow(x=np.reshape(image, (1, size[0], size[1], 3)),
                                batch_size=32,
                                shuffle=True,
                                sample_weight=None,
                                save_to_dir=aug_img_dir,
                                save_prefix="Aug_Train",
                                save_format='jpg').next()
            images_after_aug.append(np.reshape(img, (size[0], size[1], 3)))
            labels_after_aug.append(labels[index])
    aug_labels = np.asarray(labels_after_aug)
    aug_images = np.asarray(images_after_aug)
    return aug_images, aug_labels
# Generating the augmented images of only 1 image
augmented_images1, augmented_labels1 = generate_aug_images(all_train_images[0:1],
labels=train_labels,
data_gen=train_img_gen,
aug_img_dir=aug_imgs_path)
# Checking the shapes of augmented images & labels array
augmented_images1.shape, augmented_labels1.shape
((9, 224, 224, 3), (9, 4))
# Checking the shape of a single row of the 0th augmented image
augmented_images1[0][0].shape
(224, 3)
def plot_aug_images(org_images, aug_images):
    """
    Description : This function visualizes the original image and its augmented versions.
    """
    with plt.style.context('seaborn'):
        fig, ax = plt.subplots(nrows=2, ncols=5, figsize=(16,8), sharex=True, sharey=True)
        ax[0,0].imshow(org_images[0], aspect='auto')
        ax[0,0].set_title("Original", fontdict=label_font_dict)
        ax[0,0].axis("off")
        r = 0
        c = 1
        for i in range(9):
            ax[r,c].imshow(aug_images[i], aspect='auto')
            ax[r,c].set_title("Aug Image - {}".format(i+1), fontdict=label_font_dict)
            ax[r,c].axis("off")
            c += 1
            if c >= 5:
                r = 1
                c = 0
        plt.show()
plot_aug_images(org_images=all_train_images, aug_images=augmented_images1)
WARNING:matplotlib.image:Clipping input data to the valid range for imshow with RGB data ([0..1] for floats or [0..255] for integers).
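The manual `r`/`c` bookkeeping in `plot_aug_images` can equivalently be written with `divmod`, mapping the flat loop index onto the 2x5 grid while skipping slot (0, 0), which holds the original image. A quick sketch of the equivalence:

```python
# Flat indices 0..8 mapped to (row, col) slots of a 2x5 grid;
# slot (0, 0) is reserved for the original image, so we offset by 1.
positions = [divmod(i + 1, 5) for i in range(9)]
print(positions)
# [(0, 1), (0, 2), (0, 3), (0, 4), (1, 0), (1, 1), (1, 2), (1, 3), (1, 4)]
```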
# Generating the augmented images of only 1 image
augmented_images2, augmented_labels2 = generate_aug_images(all_train_images[9:10],
labels=train_labels,
data_gen=train_img_gen,
aug_img_dir=aug_imgs_path)
# Checking the shapes of augmented images & labels array
augmented_images2.shape, augmented_labels2.shape
((9, 224, 224, 3), (9, 4))
# Checking the shape of a single row of the 0th augmented image
augmented_images2[0][0].shape
(224, 3)
plot_aug_images(org_images=all_train_images[9:10], aug_images=augmented_images2)
WARNING:matplotlib.image:Clipping input data to the valid range for imshow with RGB data ([0..1] for floats or [0..255] for integers).
# Generating the augmented images of only 1 image
augmented_images3, augmented_labels3 = generate_aug_images(all_train_images[49:50],
labels=train_labels,
data_gen=train_img_gen,
aug_img_dir=aug_imgs_path)
# Checking the shapes of augmented images & labels array
augmented_images3.shape, augmented_labels3.shape
((9, 224, 224, 3), (9, 4))
# Checking the shape of a single row of the 0th augmented image
augmented_images3[0][0].shape
(224, 3)
plot_aug_images(org_images=all_train_images[49:50], aug_images=augmented_images3)
WARNING:matplotlib.image:Clipping input data to the valid range for imshow with RGB data ([0..1] for floats or [0..255] for integers).
# Generating the augmented images of all train images
augmented_images, augmented_labels = generate_aug_images(all_train_images,
labels=train_labels,
data_gen=train_img_gen,
aug_img_dir=aug_imgs_path)
# Checking the shapes of augmented images & labels array
augmented_images.shape, augmented_labels.shape
((13104, 224, 224, 3), (13104, 4))
# Checking the shape of a single row of the 0th augmented image
augmented_images[0][0].shape
(224, 3)
# Train and Validation Split
X_train, X_val, y_train, y_val = train_test_split(augmented_images, augmented_labels, test_size = 0.15, random_state = 44)
# Shape of TRAIN & VALIDATION sets after augmentation
X_train.shape, X_val.shape, y_train.shape, y_val.shape
((11138, 224, 224, 3), (1966, 224, 224, 3), (11138, 4), (1966, 4))
y_train
array([[0., 0., 0., 1.],
[0., 0., 1., 0.],
[0., 0., 0., 1.],
...,
[1., 0., 0., 0.],
[0., 0., 0., 1.],
[1., 0., 0., 0.]], dtype=float32)
y_val
array([[0., 0., 0., 1.],
[0., 0., 0., 1.],
[0., 0., 1., 0.],
...,
[0., 0., 1., 0.],
[1., 0., 0., 0.],
[0., 0., 0., 1.]], dtype=float32)
type(X_train), type(y_train), type(X_val), type(y_val)
(numpy.ndarray, numpy.ndarray, numpy.ndarray, numpy.ndarray)
# Saving the pre-processed TRAIN, VALIDATION & TEST sets
np.savez("X_train.npz",X_train)
np.savez("y_train.npz",y_train)
np.savez("X_val.npz",X_val)
np.savez("y_val.npz",y_val)
np.savez("X_test.npz",X_test)
np.savez("y_test.npz",y_test)
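Given how large these arrays are (the saved X_train.npz comes out at several gigabytes), `np.savez_compressed` is a drop-in alternative worth considering: it trades save time for disk space. A minimal round-trip sketch against a temporary directory:

```python
import os
import tempfile
import numpy as np

arr = np.random.rand(8, 224, 224, 3).astype(np.float32)

with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "sample.npz")
    np.savez_compressed(path, arr)       # same call signature as np.savez
    loaded = np.load(path)["arr_0"]      # unnamed arrays are keyed as arr_0
    assert np.array_equal(arr, loaded)   # the round trip is lossless
    print(loaded.shape)  # (8, 224, 224, 3)
```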
# mount the drive
from google.colab import drive
drive.mount('/content/drive')
Mounted at /content/drive
# copy it there
!cp X_train.npz /content/drive/MyDrive/AAIC_Case_Study_2
!cp y_train.npz /content/drive/MyDrive/AAIC_Case_Study_2
!cp X_val.npz /content/drive/MyDrive/AAIC_Case_Study_2
!cp y_val.npz /content/drive/MyDrive/AAIC_Case_Study_2
!cp X_test.npz /content/drive/MyDrive/AAIC_Case_Study_2
!cp y_test.npz /content/drive/MyDrive/AAIC_Case_Study_2
# Downloading the X_train, y_train, X_val, y_val, X_test, y_test files
!gdown 1-0u3ptDTaldA1VhN-RolsS4NCDbFOUES
!gdown 1-3Ex9EW35XQ9ImO_-dZtoMezumtlZlG1
!gdown 1-5-COzsDSvvlaYiXOVbqFD3FknDXpT3I
!gdown 1-5Upw5HC1kdmiuQ-4jP_vY8sJ-lelmHi
!gdown 1_2-6Jiu_GzC0H_3CWmAoBVyzZygT-JzM
!gdown 1dsigc8-o36jGW_2UGGPyLr2_JRUDG-xB
Downloading...
From: https://drive.google.com/uc?id=1-0u3ptDTaldA1VhN-RolsS4NCDbFOUES
To: /content/X_val.npz
100% 1.18G/1.18G [00:05<00:00, 228MB/s]
Downloading...
From: https://drive.google.com/uc?id=1-3Ex9EW35XQ9ImO_-dZtoMezumtlZlG1
To: /content/y_val.npz
100% 31.7k/31.7k [00:00<00:00, 33.6MB/s]
Downloading...
From: https://drive.google.com/uc?id=1-5-COzsDSvvlaYiXOVbqFD3FknDXpT3I
To: /content/X_test.npz
100% 440M/440M [00:02<00:00, 189MB/s]
Downloading...
From: https://drive.google.com/uc?id=1-5Upw5HC1kdmiuQ-4jP_vY8sJ-lelmHi
To: /content/y_test.npz
100% 6.10k/6.10k [00:00<00:00, 10.6MB/s]
Downloading...
From: https://drive.google.com/uc?id=1_2-6Jiu_GzC0H_3CWmAoBVyzZygT-JzM
To: /content/X_train.npz
100% 6.71G/6.71G [00:30<00:00, 217MB/s]
Downloading...
From: https://drive.google.com/uc?id=1dsigc8-o36jGW_2UGGPyLr2_JRUDG-xB
To: /content/y_train.npz
100% 178k/178k [00:00<00:00, 87.9MB/s]
# Loading the X_train, y_train, X_val, y_val, X_test & y_test files in memory
x_train_file = np.load("X_train.npz")
X_train = x_train_file.f.arr_0
y_train_file = np.load("y_train.npz")
y_train = y_train_file.f.arr_0
x_val_file = np.load("X_val.npz")
X_val = x_val_file.f.arr_0
y_val_file = np.load("y_val.npz")
y_val = y_val_file.f.arr_0
x_test_file = np.load("X_test.npz")
X_test = x_test_file.f.arr_0
y_test_file = np.load("y_test.npz")
y_test = y_test_file.f.arr_0
# Freeing memory
del x_train_file
del y_train_file
del x_val_file
del y_val_file
del x_test_file
del y_test_file
# Checking types of TRAIN, VAL & TEST sets
type(X_train), type(y_train), type(X_val), type(y_val), type(X_test), type(y_test)
(numpy.ndarray, numpy.ndarray, numpy.ndarray, numpy.ndarray, numpy.ndarray, numpy.ndarray)
# Checking shapes of TRAIN, VAL & TEST sets
X_train.shape, y_train.shape, X_val.shape, y_val.shape, X_test.shape, y_test.shape
((11138, 224, 224, 3), (11138, 4), (1966, 224, 224, 3), (1966, 4), (365, 224, 224, 3), (365, 4))
train_labels_df = pd.DataFrame(y_train, columns=cols)
val_labels_df = pd.DataFrame(y_val, columns=cols)
# Images representing HEALTHY in TRAIN
tr_hl_lbls_counts = pd.DataFrame(train_labels_df['Healthy'].value_counts())
tr_hl_lbls_counts
| | Healthy |
|---|---|
| 0.0 | 7996 |
| 1.0 | 3142 |
# Images representing MULTIPLE_DISEASES in TRAIN
tr_md_lbls_counts = pd.DataFrame(train_labels_df['Multiple_Diseases'].value_counts())
tr_md_lbls_counts
| | Multiple_Diseases |
|---|---|
| 0.0 | 10576 |
| 1.0 | 562 |
# Images representing RUST in TRAIN
tr_rusty_lbls_counts = pd.DataFrame(train_labels_df['Rust'].value_counts())
tr_rusty_lbls_counts
| | Rust |
|---|---|
| 0.0 | 7314 |
| 1.0 | 3824 |
# Images representing SCAB in TRAIN
tr_scaby_lbls_counts = pd.DataFrame(train_labels_df['Scab'].value_counts())
tr_scaby_lbls_counts
| | Scab |
|---|---|
| 0.0 | 7528 |
| 1.0 | 3610 |
# TRAIN TGT labels distribution
train_tgt_classes_dist = pd.concat([tr_hl_lbls_counts, tr_md_lbls_counts, tr_rusty_lbls_counts, tr_scaby_lbls_counts], axis=1)
train_tgt_classes_dist
| | Healthy | Multiple_Diseases | Rust | Scab |
|---|---|---|---|---|
| 0.0 | 7996 | 10576 | 7314 | 7528 |
| 1.0 | 3142 | 562 | 3824 | 3610 |
OBSERVATION
The Multiple_Diseases class has far fewer samples than the other classes.
# Images representing HEALTHY in VALIDATION set
val_hl_lbls_counts = pd.DataFrame(val_labels_df['Healthy'].value_counts())
val_hl_lbls_counts
| | Healthy |
|---|---|
| 0.0 | 1391 |
| 1.0 | 575 |
# Images representing MULTIPLE_DISEASES in VALIDATION set
val_md_lbls_counts = pd.DataFrame(val_labels_df['Multiple_Diseases'].value_counts())
val_md_lbls_counts
| | Multiple_Diseases |
|---|---|
| 0.0 | 1871 |
| 1.0 | 95 |
# Images representing RUST in VALIDATION set
val_rusty_lbls_counts = pd.DataFrame(val_labels_df['Rust'].value_counts())
val_rusty_lbls_counts
| | Rust |
|---|---|
| 0.0 | 1317 |
| 1.0 | 649 |
# Images representing SCAB in VALIDATION set
val_scaby_lbls_counts = pd.DataFrame(val_labels_df['Scab'].value_counts())
val_scaby_lbls_counts
| | Scab |
|---|---|
| 0.0 | 1319 |
| 1.0 | 647 |
# VALIDATION set TGT labels distribution
val_tgt_classes_dist = pd.concat([val_hl_lbls_counts, val_md_lbls_counts, val_rusty_lbls_counts, val_scaby_lbls_counts], axis=1)
val_tgt_classes_dist
| | Healthy | Multiple_Diseases | Rust | Scab |
|---|---|---|---|---|
| 0.0 | 1391 | 1871 | 1317 | 1319 |
| 1.0 | 575 | 95 | 649 | 647 |
OBSERVATION
The Multiple_Diseases class has far fewer samples (only 95) than the other classes.
# Freeing memory
del augmented_images
del augmented_labels
del augmented_images1
del augmented_images2
del augmented_images3
del augmented_labels1
del augmented_labels2
del augmented_labels3
Calculating_Class_Weights
y_train.shape
(11138, 4)
cols = ['Healthy', 'Multiple_Diseases', 'Rust', 'Scab']
tmp_cw = pd.DataFrame(y_train, columns=cols)
tmp_cw
| | Healthy | Multiple_Diseases | Rust | Scab |
|---|---|---|---|---|
| 0 | 0.0 | 0.0 | 0.0 | 1.0 |
| 1 | 0.0 | 0.0 | 1.0 | 0.0 |
| 2 | 0.0 | 0.0 | 0.0 | 1.0 |
| 3 | 0.0 | 0.0 | 1.0 | 0.0 |
| 4 | 0.0 | 0.0 | 0.0 | 1.0 |
| ... | ... | ... | ... | ... |
| 11133 | 1.0 | 0.0 | 0.0 | 0.0 |
| 11134 | 1.0 | 0.0 | 0.0 | 0.0 |
| 11135 | 1.0 | 0.0 | 0.0 | 0.0 |
| 11136 | 0.0 | 0.0 | 0.0 | 1.0 |
| 11137 | 1.0 | 0.0 | 0.0 | 0.0 |
11138 rows × 4 columns
def class_label(hl, md, rs, sc):
    """
    Description : This function assigns the class label tag.
    """
    if hl == 1:
        return 0
    elif md == 1:
        return 1
    elif rs == 1:
        return 2
    elif sc == 1:
        return 3
tmp_cw['label'] = tmp_cw[['Healthy', 'Multiple_Diseases', 'Rust', 'Scab']].apply(lambda row: class_label(row['Healthy'],
row['Multiple_Diseases'],
row['Rust'],
row['Scab']), axis=1)
np.unique(tmp_cw['label'].values)
array([0, 1, 2, 3])
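Since each row of `tmp_cw` is one-hot, the row-wise `apply` above can also be replaced by a single vectorized `argmax`, which yields the same 0-3 labels in the same column order. A small sketch on a toy one-hot array:

```python
import numpy as np

# Toy one-hot rows in the order [Healthy, Multiple_Diseases, Rust, Scab]
one_hot = np.array([[1, 0, 0, 0],
                    [0, 0, 1, 0],
                    [0, 0, 0, 1],
                    [0, 1, 0, 0]], dtype=np.float32)

labels = one_hot.argmax(axis=1)  # column index of the 1 in each row
print(labels)  # [0 2 3 1]
```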
# Calculating the class weights
cw1 = class_weight.compute_class_weight('balanced', classes = np.unique(tmp_cw['label'].values), y=tmp_cw['label'].values)
cw1
array([0.88621897, 4.95462633, 0.72816423, 0.77132964])
# Storing the weights as a dict
cw1_dict = {0:cw1[0], 1:cw1[1], 2:cw1[2], 3:cw1[3]}
cw1_dict
{0: 0.8862189688096753,
1: 4.954626334519573,
2: 0.7281642259414226,
3: 0.7713296398891967}
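The 'balanced' option is just the heuristic n_samples / (n_classes * count_c), so the dictionary above can be reproduced directly from the per-class positive counts in the augmented TRAIN distribution (3142, 562, 3824, 3610, summing to 11138):

```python
import numpy as np

# Per-class positive counts from the augmented TRAIN distribution above
counts = np.array([3142, 562, 3824, 3610])   # Healthy, Multiple_Diseases, Rust, Scab
n_samples = counts.sum()                     # 11138

# 'balanced' heuristic: n_samples / (n_classes * count_c)
weights = n_samples / (len(counts) * counts)
print({i: w for i, w in enumerate(weights)})
# ≈ {0: 0.886..., 1: 4.954..., 2: 0.728..., 3: 0.771...}
```

These match the `compute_class_weight('balanced', ...)` output above, confirming which classes get up-weighted (Multiple_Diseases most of all).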
Defining_Performance_Metrics
# Declaring the metrics
tfa_f1_scr = tfa.metrics.F1Score(num_classes=4, average='macro')
# Defining the AUC Score method
def auc_score(y_true, y_pred):
    """
    Description : This function returns the ROC AUC score (falling back to 0.5 when the batch contains only one class).
    """
    if len(np.unique(y_true[:,1])) == 1:
        return 0.5
    else:
        return roc_auc_score(y_true, y_pred)
# Calculating the AUC score
def auc(y_true, y_pred):
    """
    Description : This function wraps the ROC AUC computation as a TF function for graph computation.
    """
    score = tf.py_function(auc_score, [y_true, y_pred], 'float32', name='sklearnAUC')
    return score
cols = ['Healthy', 'Multiple_Diseases', 'Rust', 'Scab']
# Generating the multi-class confusion matrix for seeing the classification results
def confusion_matrix_(actual_labels, dataset, model, BATCH_SIZE=32):
"""
Description : This function is created for generating the confusion matrix for all the tgt classes.
"""
# Model predictions
y_pred = model.predict(dataset, batch_size=BATCH_SIZE)
# Storing the predictions in the form of 0 or 1
for i in range(y_pred.shape[0]):
y_pred[i] = np.where(y_pred[i] == y_pred[i].max(), 1, 0)
# Storing the actual labels for all tgt classes
actual_healthy = np.choose([0], actual_labels.T)
actual_multiple_diseases = np.choose([1], actual_labels.T)
actual_rust = np.choose([2], actual_labels.T)
actual_scab = np.choose([3], actual_labels.T)
# Storing the predicted labels for all tgt classes
predicted_healthy = np.choose([0], y_pred.T)
predicted_multiple_diseases = np.choose([1], y_pred.T)
predicted_rust = np.choose([2], y_pred.T)
predicted_scab = np.choose([3], y_pred.T)
# Generating the perf metrics score
## Accuracy
calc_acc = tf.keras.metrics.BinaryAccuracy()
calc_acc.update_state(actual_healthy, predicted_healthy)
acc_healthy = calc_acc.result().numpy()
calc_acc.update_state(actual_multiple_diseases, predicted_multiple_diseases)
acc_md = calc_acc.result().numpy()
calc_acc.update_state(actual_rust, predicted_rust)
acc_rust = calc_acc.result().numpy()
calc_acc.update_state(actual_scab, predicted_scab)
acc_scab = calc_acc.result().numpy()
acc_df = pd.DataFrame(np.array([acc_healthy, acc_md, acc_rust, acc_scab])).T
acc_df.columns = cols
acc_df.index = ["BINARY Accuracy"]
## Precision
calc_prec = tf.keras.metrics.Precision()
calc_prec.update_state(actual_healthy, predicted_healthy)
prec_healthy = calc_prec.result().numpy()
calc_prec.update_state(actual_multiple_diseases, predicted_multiple_diseases)
prec_md = calc_prec.result().numpy()
calc_prec.update_state(actual_rust, predicted_rust)
prec_rust = calc_prec.result().numpy()
calc_prec.update_state(actual_scab, predicted_scab)
prec_scab = calc_prec.result().numpy()
prec_df = pd.DataFrame(np.array([prec_healthy, prec_md, prec_rust, prec_scab])).T
prec_df.columns = cols
prec_df.index = ["Precision"]
## Recall
calc_rec = tf.keras.metrics.Recall()
calc_rec.update_state(actual_healthy, predicted_healthy)
rec_healthy = calc_rec.result().numpy()
calc_rec.update_state(actual_multiple_diseases, predicted_multiple_diseases)
rec_md = calc_rec.result().numpy()
calc_rec.update_state(actual_rust, predicted_rust)
rec_rust = calc_rec.result().numpy()
calc_rec.update_state(actual_scab, predicted_scab)
rec_scab = calc_rec.result().numpy()
rec_df = pd.DataFrame(np.array([rec_healthy, rec_md, rec_rust, rec_scab])).T
rec_df.columns = cols
rec_df.index = ["Recall"]
## F1 Score
f1_scr_hl = f1_score(actual_healthy, predicted_healthy, average='macro')
f1_scr_md = f1_score(actual_multiple_diseases, predicted_multiple_diseases, average='macro')
f1_scr_rs = f1_score(actual_rust, predicted_rust, average='macro')
f1_scr_sc = f1_score(actual_scab, predicted_scab, average='macro')
f1_scr_df = pd.DataFrame(np.array([f1_scr_hl, f1_scr_md, f1_scr_rs, f1_scr_sc])).T
f1_scr_df.columns = cols
f1_scr_df.index = ["Macro F1 Score"]
## ROC AUC Score
roc_auc_scr_hl = roc_auc_score(actual_healthy, predicted_healthy, average='macro')
roc_auc_scr_md = roc_auc_score(actual_multiple_diseases, predicted_multiple_diseases, average='macro')
roc_auc_scr_rs = roc_auc_score(actual_rust, predicted_rust, average='macro')
roc_auc_scr_sc = roc_auc_score(actual_scab, predicted_scab, average='macro')
roc_auc_scr_df = pd.DataFrame(np.array([roc_auc_scr_hl, roc_auc_scr_md, roc_auc_scr_rs, roc_auc_scr_sc])).T
roc_auc_scr_df.columns = cols
roc_auc_scr_df.index = ["Macro ROC AUC Score"]
## Final Results
results_df = pd.concat([acc_df, prec_df, rec_df, f1_scr_df, roc_auc_scr_df], axis=0)
results_df = results_df.applymap(lambda val: np.round(val,4))
# Plotting the confusion matrix
with plt.style.context('seaborn-poster'):
fig, ax = plt.subplots(nrows=2, ncols=2, figsize=(16,14), sharex=False, sharey=False)
# CM for healthy class
cm1 = confusion_matrix(actual_healthy, predicted_healthy,labels=[0,1])
sns.heatmap(cm1, annot=True, cmap="plasma", fmt="d", xticklabels=['Not Healthy','Healthy'], yticklabels=['Not Healthy','Healthy'], cbar=False, ax=ax[0,0])
ax[0,0].set_xlabel('Predicted', fontdict=label_font_dict)
ax[0,0].set_ylabel('Actual', fontdict=label_font_dict)
ax0_xticks = ax[0,0].get_xticklabels()
ax0_yticks = ax[0,0].get_yticklabels()
ax[0,0].set_xticklabels(labels=ax0_xticks, fontdict=ticks_font_dict)
ax[0,0].set_yticklabels(labels=ax0_yticks, fontdict=ticks_font_dict)
ax[0,0].set_title("Healthy", fontdict=title_font_dict)
# CM for multiple diseases class
cm2 = confusion_matrix(actual_multiple_diseases, predicted_multiple_diseases, labels=[0,1])
sns.heatmap(cm2, annot=True, cmap="plasma", fmt="d", xticklabels=['No M.D','M.D'], yticklabels=['No M.D','M.D'], cbar=False, ax=ax[0,1])
ax[0,1].set_xlabel('Predicted', fontdict=label_font_dict)
ax[0,1].set_ylabel('Actual', fontdict=label_font_dict)
ax1_xticks = ax[0,1].get_xticklabels()
ax1_yticks = ax[0,1].get_yticklabels()
ax[0,1].set_xticklabels(labels=ax1_xticks, fontdict=ticks_font_dict)
ax[0,1].set_yticklabels(labels=ax1_yticks, fontdict=ticks_font_dict)
ax[0,1].set_title("Multiple Diseases", fontdict=title_font_dict)
# CM for rust class
cm3 = confusion_matrix(actual_rust, predicted_rust,labels=[0,1])
sns.heatmap(cm3, annot=True, cmap="plasma", fmt="d", xticklabels=['No Rust','Rust'], yticklabels=['No Rust','Rust'], cbar=False, ax=ax[1,0])
ax[1,0].set_xlabel('Predicted', fontdict=label_font_dict)
ax[1,0].set_ylabel('Actual', fontdict=label_font_dict)
ax2_xticks = ax[1,0].get_xticklabels()
ax2_yticks = ax[1,0].get_yticklabels()
ax[1,0].set_xticklabels(labels=ax2_xticks, fontdict=ticks_font_dict)
ax[1,0].set_yticklabels(labels=ax2_yticks, fontdict=ticks_font_dict)
ax[1,0].set_title("Rust", fontdict=title_font_dict)
# CM for scab class
cm4 = confusion_matrix(actual_scab, predicted_scab,labels=[0,1])
sns.heatmap(cm4, annot=True, cmap="plasma", fmt="d", xticklabels=['No Scab','Scab'], yticklabels=['No Scab','Scab'], cbar=False, ax=ax[1,1])
ax[1,1].set_xlabel('Predicted', fontdict=label_font_dict)
ax[1,1].set_ylabel('Actual', fontdict=label_font_dict)
ax3_xticks = ax[1,1].get_xticklabels()
ax3_yticks = ax[1,1].get_yticklabels()
ax[1,1].set_xticklabels(labels=ax3_xticks, fontdict=ticks_font_dict)
ax[1,1].set_yticklabels(labels=ax3_yticks, fontdict=ticks_font_dict)
ax[1,1].set_title("Scab", fontdict=title_font_dict)
plt.show()
return results_df
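Each of the 2x2 matrices plotted above simply counts (actual, predicted) pairs for one one-vs-rest split. A minimal NumPy equivalent of `confusion_matrix(..., labels=[0,1])`, using hypothetical arrays, makes the row/column convention explicit:

```python
import numpy as np

# Hypothetical one-vs-rest labels
actual = np.array([0, 1, 1, 0, 1, 0])
predicted = np.array([0, 1, 0, 0, 1, 1])

# cm[i, j] = number of samples with actual class i predicted as class j
cm = np.zeros((2, 2), dtype=int)
for a, p in zip(actual, predicted):
    cm[a, p] += 1
print(cm)
# Row 0 holds true negatives / false positives; row 1 holds false negatives / true positives
```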
# Loading the tensorboard
%load_ext tensorboard
# Folder storing all the runs logs
root_logdir = os.path.join(os.path.join(os.curdir, "logs"), "fit")
# Get current run logs directory
def get_run_logdir():
    run_id = time.strftime("run_%Y_%m_%d-%H_%M_%S")
    return os.path.join(root_logdir, run_id)
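The run id embeds a second-resolution timestamp, so every launch gets its own subfolder (and the names sort chronologically, which is convenient when TensorBoard lists runs side by side). The naming scheme can be checked in isolation:

```python
import re
import time

# e.g. "run_2022_10_01-12_30_45" (timestamp varies per run)
run_id = time.strftime("run_%Y_%m_%d-%H_%M_%S")
assert re.fullmatch(r"run_\d{4}_\d{2}_\d{2}-\d{2}_\d{2}_\d{2}", run_id)
```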
Models_Training_Configuration¶# Defining the batch-size
BATCH_SIZE = 32
# Using AUTOTUNE so tf.data can dynamically tune parallelism and prefetching in the input pipeline
AUTO = tf.data.experimental.AUTOTUNE
Models_Set_1¶A.Global_Tuning---Custom_TopLayers---ImageNet_Weights¶A1.ResNet---50¶# build the ResNet-50 network
resnet50_with_no_top_model = tf.keras.applications.ResNet50(include_top=False, weights='imagenet', input_shape=(224,224,3))
# Model summary
resnet50_with_no_top_model.summary()
Model: "resnet50"
__________________________________________________________________________________________________
 Layer (type)                   Output Shape           Param #    Connected to
==================================================================================================
 input_1 (InputLayer)           [(None, 224, 224, 3)]  0          []
 conv1_pad (ZeroPadding2D)      (None, 230, 230, 3)    0          ['input_1[0][0]']
 conv1_conv (Conv2D)            (None, 112, 112, 64)   9472       ['conv1_pad[0][0]']
 conv1_bn (BatchNormalization)  (None, 112, 112, 64)   256        ['conv1_conv[0][0]']
 conv1_relu (Activation)        (None, 112, 112, 64)   0          ['conv1_bn[0][0]']
 pool1_pad (ZeroPadding2D)      (None, 114, 114, 64)   0          ['conv1_relu[0][0]']
 pool1_pool (MaxPooling2D)      (None, 56, 56, 64)     0          ['pool1_pad[0][0]']
 ... (bottleneck residual blocks conv2_block1 through conv5_block3_2_relu elided for brevity) ...
 conv5_block3_3_conv (Conv2D)   (None, 7, 7, 2048)     1050624    ['conv5_block3_2_relu[0][0]']
 conv5_block3_3_bn (BatchNormalization)  (None, 7, 7, 2048)  8192  ['conv5_block3_3_conv[0][0]']
 conv5_block3_add (Add)         (None, 7, 7, 2048)     0          ['conv5_block2_out[0][0]', 'conv5_block3_3_bn[0][0]']
 conv5_block3_out (Activation)  (None, 7, 7, 2048)     0          ['conv5_block3_add[0][0]']
==================================================================================================
Total params: 23,587,712
Trainable params: 23,534,592
Non-trainable params: 53,120
__________________________________________________________________________________________________
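The Param # column follows directly from layer shapes: a Conv2D layer with a k x k kernel, C_in input channels, and C_out filters holds k*k*C_in*C_out weights plus C_out biases, while BatchNormalization holds four values per channel (gamma, beta, moving mean, moving variance). A few rows from the summary, recomputed:

```python
def conv2d_params(k, c_in, c_out):
    # k x k kernel weights, plus one bias per filter
    return k * k * c_in * c_out + c_out

def batchnorm_params(channels):
    # gamma, beta, moving mean, moving variance per channel
    return 4 * channels

assert conv2d_params(7, 3, 64) == 9472        # conv1_conv
assert conv2d_params(1, 64, 256) == 16640     # conv2_block1_3_conv
assert conv2d_params(3, 512, 512) == 2359808  # conv5_block1_2_conv
assert batchnorm_params(64) == 256            # conv1_bn
```

The 53,120 non-trainable parameters are exactly the moving-mean/variance halves of the BatchNormalization layers, which are updated by running statistics rather than by gradients.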
# Instantiating the Adam optimizer
learning_rate = 0.0001
opt1 = tf.keras.optimizers.Adam(learning_rate=learning_rate)
# Reduce Learning Rate on Plateau
reduce_lr1 = tf.keras.callbacks.ReduceLROnPlateau(monitor='val_loss', factor=0.1, patience=2, verbose=1, mode='auto', min_delta=0.0001)
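With `factor=0.1`, every plateau (no `val_loss` improvement beyond `min_delta` for `patience=2` epochs) multiplies the current learning rate by 0.1. Starting from the 1e-4 set for the Adam optimizer above, three hypothetical plateaus would produce:

```python
lr = 1e-4      # initial learning rate, as set for opt1 above
factor = 0.1   # multiplicative reduction applied at each plateau

schedule = []
for plateau in range(3):  # three hypothetical plateaus
    lr *= factor
    schedule.append(lr)
print(schedule)  # roughly [1e-05, 1e-06, 1e-07]
```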
# Logs directory
curr_run_logdir1 = get_run_logdir()
# Instantiating Tensorboard callback
tensorboard_callback1 = TensorBoard(log_dir=curr_run_logdir1, histogram_freq=1)
# Setting the seed
os.environ['PYTHONHASHSEED'] = '0'
# Clearing the TF session
tf.keras.backend.clear_session()
# Making every layer of the ResNet-50 base trainable (global fine-tuning)
for layer in resnet50_with_no_top_model.layers:
    layer.trainable = True
# Attaching custom top layers on the base model's output
base_output = resnet50_with_no_top_model.output
# Defining the top layers structure of the model
flatten = tf.keras.layers.GlobalAveragePooling2D(name='Flatten_for_hidden_layers')(base_output)
dropout_1 = Dropout(rate=0.5, name='Dropout1')(flatten)
dense_layer1 = tf.keras.layers.Dense(units=128,
activation='relu',
use_bias=True,
kernel_initializer=tf.keras.initializers.he_normal(seed=80),
bias_initializer=tf.keras.initializers.he_normal(seed=110),
name='Hidden_Layer1')(dropout_1)
dropout_2 = Dropout(rate=0.5, name='Dropout2')(dense_layer1)
dense_layer2 = tf.keras.layers.Dense(units=128,
activation='relu',
use_bias=True,
kernel_initializer=tf.keras.initializers.he_normal(seed=80),
bias_initializer=tf.keras.initializers.he_normal(seed=110),
name='Hidden_Layer2')(dropout_2)
dropout_3 = Dropout(rate=0.5, name='Dropout3')(dense_layer2)
dense_layer3 = tf.keras.layers.Dense(units=128,
activation='relu',
use_bias=True,
kernel_initializer=tf.keras.initializers.he_normal(seed=80),
bias_initializer=tf.keras.initializers.he_normal(seed=110),
name='Hidden_Layer3')(dropout_3)
output_layer = tf.keras.layers.Dense(4, activation='softmax', name="output")(dense_layer3)
# Instantiating the complete model
resnet_50 = Model(inputs=resnet50_with_no_top_model.input, outputs=output_layer)
# Compiling the model
resnet_50.compile(optimizer=opt1,
loss = 'categorical_crossentropy',
metrics=['categorical_accuracy', tfa_f1_scr])
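The custom head adds only a modest number of weights on top of the 23.6M backbone: GlobalAveragePooling2D collapses the 7x7x2048 feature map to a 2048-vector, Dropout layers carry no parameters, and each Dense layer contributes in_features*units + units parameters. A rough count:

```python
def dense_params(n_in, n_out):
    # weight matrix plus one bias per unit
    return n_in * n_out + n_out

head = (
    dense_params(2048, 128)   # Hidden_Layer1: 262,272
    + dense_params(128, 128)  # Hidden_Layer2: 16,512
    + dense_params(128, 128)  # Hidden_Layer3: 16,512
    + dense_params(128, 4)    # output (softmax over 4 classes): 516
)
print(head)  # 295812 additional trainable parameters
```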
# Summary of the ResNet-50 model with custom top
resnet_50.summary()
Model: "model"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
 input_1 (InputLayer)           [(None, 224, 224, 3)]  0          []
 conv1_pad (ZeroPadding2D)      (None, 230, 230, 3)    0          ['input_1[0][0]']
 conv1_conv (Conv2D)            (None, 112, 112, 64)   9472       ['conv1_pad[0][0]']
 conv1_bn (BatchNormalization)  (None, 112, 112, 64)   256        ['conv1_conv[0][0]']
 conv1_relu (Activation)        (None, 112, 112, 64)   0          ['conv1_bn[0][0]']
 pool1_pad (ZeroPadding2D)      (None, 114, 114, 64)   0          ['conv1_relu[0][0]']
 pool1_pool (MaxPooling2D)      (None, 56, 56, 64)     0          ['pool1_pad[0][0]']
 ... (backbone bottleneck blocks identical to the ResNet-50 summary above; elided through conv4_block6_2_bn) ...
conv4_block6_2_relu (Activatio (None, 14, 14, 256) 0 ['conv4_block6_2_bn[0][0]']
n)
conv4_block6_3_conv (Conv2D) (None, 14, 14, 1024 263168 ['conv4_block6_2_relu[0][0]']
)
conv4_block6_3_bn (BatchNormal (None, 14, 14, 1024 4096 ['conv4_block6_3_conv[0][0]']
ization) )
conv4_block6_add (Add) (None, 14, 14, 1024 0 ['conv4_block5_out[0][0]',
) 'conv4_block6_3_bn[0][0]']
conv4_block6_out (Activation) (None, 14, 14, 1024 0 ['conv4_block6_add[0][0]']
)
conv5_block1_1_conv (Conv2D) (None, 7, 7, 512) 524800 ['conv4_block6_out[0][0]']
conv5_block1_1_bn (BatchNormal (None, 7, 7, 512) 2048 ['conv5_block1_1_conv[0][0]']
ization)
conv5_block1_1_relu (Activatio (None, 7, 7, 512) 0 ['conv5_block1_1_bn[0][0]']
n)
conv5_block1_2_conv (Conv2D) (None, 7, 7, 512) 2359808 ['conv5_block1_1_relu[0][0]']
conv5_block1_2_bn (BatchNormal (None, 7, 7, 512) 2048 ['conv5_block1_2_conv[0][0]']
ization)
conv5_block1_2_relu (Activatio (None, 7, 7, 512) 0 ['conv5_block1_2_bn[0][0]']
n)
conv5_block1_0_conv (Conv2D) (None, 7, 7, 2048) 2099200 ['conv4_block6_out[0][0]']
conv5_block1_3_conv (Conv2D) (None, 7, 7, 2048) 1050624 ['conv5_block1_2_relu[0][0]']
conv5_block1_0_bn (BatchNormal (None, 7, 7, 2048) 8192 ['conv5_block1_0_conv[0][0]']
ization)
conv5_block1_3_bn (BatchNormal (None, 7, 7, 2048) 8192 ['conv5_block1_3_conv[0][0]']
ization)
conv5_block1_add (Add) (None, 7, 7, 2048) 0 ['conv5_block1_0_bn[0][0]',
'conv5_block1_3_bn[0][0]']
conv5_block1_out (Activation) (None, 7, 7, 2048) 0 ['conv5_block1_add[0][0]']
conv5_block2_1_conv (Conv2D) (None, 7, 7, 512) 1049088 ['conv5_block1_out[0][0]']
conv5_block2_1_bn (BatchNormal (None, 7, 7, 512) 2048 ['conv5_block2_1_conv[0][0]']
ization)
conv5_block2_1_relu (Activatio (None, 7, 7, 512) 0 ['conv5_block2_1_bn[0][0]']
n)
conv5_block2_2_conv (Conv2D) (None, 7, 7, 512) 2359808 ['conv5_block2_1_relu[0][0]']
conv5_block2_2_bn (BatchNormal (None, 7, 7, 512) 2048 ['conv5_block2_2_conv[0][0]']
ization)
conv5_block2_2_relu (Activatio (None, 7, 7, 512) 0 ['conv5_block2_2_bn[0][0]']
n)
conv5_block2_3_conv (Conv2D) (None, 7, 7, 2048) 1050624 ['conv5_block2_2_relu[0][0]']
conv5_block2_3_bn (BatchNormal (None, 7, 7, 2048) 8192 ['conv5_block2_3_conv[0][0]']
ization)
conv5_block2_add (Add) (None, 7, 7, 2048) 0 ['conv5_block1_out[0][0]',
'conv5_block2_3_bn[0][0]']
conv5_block2_out (Activation) (None, 7, 7, 2048) 0 ['conv5_block2_add[0][0]']
conv5_block3_1_conv (Conv2D) (None, 7, 7, 512) 1049088 ['conv5_block2_out[0][0]']
conv5_block3_1_bn (BatchNormal (None, 7, 7, 512) 2048 ['conv5_block3_1_conv[0][0]']
ization)
conv5_block3_1_relu (Activatio (None, 7, 7, 512) 0 ['conv5_block3_1_bn[0][0]']
n)
conv5_block3_2_conv (Conv2D) (None, 7, 7, 512) 2359808 ['conv5_block3_1_relu[0][0]']
conv5_block3_2_bn (BatchNormal (None, 7, 7, 512) 2048 ['conv5_block3_2_conv[0][0]']
ization)
conv5_block3_2_relu (Activatio (None, 7, 7, 512) 0 ['conv5_block3_2_bn[0][0]']
n)
conv5_block3_3_conv (Conv2D) (None, 7, 7, 2048) 1050624 ['conv5_block3_2_relu[0][0]']
conv5_block3_3_bn (BatchNormal (None, 7, 7, 2048) 8192 ['conv5_block3_3_conv[0][0]']
ization)
conv5_block3_add (Add) (None, 7, 7, 2048) 0 ['conv5_block2_out[0][0]',
'conv5_block3_3_bn[0][0]']
conv5_block3_out (Activation) (None, 7, 7, 2048) 0 ['conv5_block3_add[0][0]']
Flatten_for_hidden_layers (Glo (None, 2048) 0 ['conv5_block3_out[0][0]']
balAveragePooling2D)
Dropout1 (Dropout) (None, 2048) 0 ['Flatten_for_hidden_layers[0][0]
']
Hidden_Layer1 (Dense) (None, 128) 262272 ['Dropout1[0][0]']
Dropout2 (Dropout) (None, 128) 0 ['Hidden_Layer1[0][0]']
Hidden_Layer2 (Dense) (None, 128) 16512 ['Dropout2[0][0]']
Dropout3 (Dropout) (None, 128) 0 ['Hidden_Layer2[0][0]']
Hidden_Layer3 (Dense) (None, 128) 16512 ['Dropout3[0][0]']
output (Dense) (None, 4) 516 ['Hidden_Layer3[0][0]']
==================================================================================================
Total params: 23,883,524
Trainable params: 23,830,404
Non-trainable params: 53,120
__________________________________________________________________________________________________
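As a quick sanity check on the custom head in the summary above, each Dense layer's parameter count is `in_features * out_features` weights plus `out_features` biases:

```python
def dense_params(n_in: int, n_out: int) -> int:
    # weights (n_in * n_out) + biases (n_out)
    return n_in * n_out + n_out

assert dense_params(2048, 128) == 262272  # Hidden_Layer1
assert dense_params(128, 128) == 16512    # Hidden_Layer2 / Hidden_Layer3
assert dense_params(128, 4) == 516        # output
```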
# plotting the model
plot_model(resnet_50, to_file='resnet_50.png', show_shapes=True, show_layer_names=True)
# Note: with func=None this merely returns the decorator (a no-op as called here)
tf.autograph.experimental.do_not_convert(func=None)
<function tensorflow.python.autograph.impl.api.do_not_convert(func=None)>
# Size of TRAIN & VALIDATION labels and BATCH SIZE
y_train.shape[0], BATCH_SIZE, y_val.shape[0]
(11138, 32, 1966)
# Calculating train steps
train_steps = y_train.shape[0] // BATCH_SIZE
train_steps
348
# Calculating test steps
valid_steps = y_val.shape[0] // BATCH_SIZE
valid_steps
61
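The floor division above drops the final partial batch each epoch; a minimal helper makes the arithmetic explicit:

```python
# Each epoch runs floor(n_samples / batch_size) full batches;
# the remainder (n_samples % batch_size) samples are skipped that epoch.
def steps_for(n_samples: int, batch_size: int) -> int:
    return n_samples // batch_size

train_steps = steps_for(11138, 32)  # -> 348 (leftover: 11138 % 32 = 2 samples)
valid_steps = steps_for(1966, 32)   # -> 61  (leftover: 1966 % 32 = 14 samples)
```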
cw1_dict
{0: 0.8862189688096753,
1: 4.954626334519573,
2: 0.7281642259414226,
3: 0.7713296398891967}
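The values in `cw1_dict` follow scikit-learn's "balanced" scheme, `weight_c = n_samples / (n_classes * count_c)`, so the rare Multiple_Diseases class gets the largest weight. A sketch with hypothetical class counts (not the notebook's actual ones):

```python
import numpy as np
from sklearn.utils import class_weight

# Hypothetical integer-encoded labels; class 1 is deliberately rare
y_int = np.array([0] * 100 + [1] * 20 + [2] * 120 + [3] * 110)
classes = np.unique(y_int)
weights = class_weight.compute_class_weight(class_weight='balanced',
                                            classes=classes, y=y_int)
cw_dict = dict(zip(classes, weights))  # rare class 1 gets the largest weight
```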
# Training the ResNet-50 model having custom top
history1 = resnet_50.fit(X_train, y_train,
epochs=16,
batch_size=BATCH_SIZE,
callbacks=[tensorboard_callback1, reduce_lr1],
steps_per_epoch=train_steps,
validation_steps=valid_steps,
validation_data=[X_val, y_val],
class_weight=cw1_dict,
verbose=1)
Epoch 1/16
348/348 [==============================] - 139s 351ms/step - loss: 1.4588 - categorical_accuracy: 0.5018 - f1_score: 0.4238 - val_loss: 50.9254 - val_categorical_accuracy: 0.2920 - val_f1_score: 0.1130 - lr: 1.0000e-04
Epoch 2/16
348/348 [==============================] - 123s 351ms/step - loss: 0.6099 - categorical_accuracy: 0.8066 - f1_score: 0.7246 - val_loss: 4.4785 - val_categorical_accuracy: 0.2905 - val_f1_score: 0.1174 - lr: 1.0000e-04
Epoch 3/16
348/348 [==============================] - 122s 351ms/step - loss: 0.3638 - categorical_accuracy: 0.8881 - f1_score: 0.8230 - val_loss: 0.8613 - val_categorical_accuracy: 0.8320 - val_f1_score: 0.7679 - lr: 1.0000e-04
Epoch 4/16
348/348 [==============================] - 122s 351ms/step - loss: 0.2624 - categorical_accuracy: 0.9225 - f1_score: 0.8720 - val_loss: 0.1750 - val_categorical_accuracy: 0.9411 - val_f1_score: 0.8905 - lr: 1.0000e-04
Epoch 5/16
348/348 [==============================] - 122s 352ms/step - loss: 0.1454 - categorical_accuracy: 0.9592 - f1_score: 0.9267 - val_loss: 0.1275 - val_categorical_accuracy: 0.9718 - val_f1_score: 0.9432 - lr: 1.0000e-04
Epoch 6/16
348/348 [==============================] - 122s 352ms/step - loss: 0.1524 - categorical_accuracy: 0.9610 - f1_score: 0.9288 - val_loss: 0.0615 - val_categorical_accuracy: 0.9826 - val_f1_score: 0.9594 - lr: 1.0000e-04
Epoch 7/16
348/348 [==============================] - 122s 351ms/step - loss: 0.1276 - categorical_accuracy: 0.9663 - f1_score: 0.9436 - val_loss: 0.3417 - val_categorical_accuracy: 0.8960 - val_f1_score: 0.8258 - lr: 1.0000e-04
Epoch 8/16
348/348 [==============================] - ETA: 0s - loss: 0.1241 - categorical_accuracy: 0.9685 - f1_score: 0.9438
Epoch 8: ReduceLROnPlateau reducing learning rate to 9.999999747378752e-06.
348/348 [==============================] - 122s 351ms/step - loss: 0.1241 - categorical_accuracy: 0.9685 - f1_score: 0.9438 - val_loss: 0.0755 - val_categorical_accuracy: 0.9790 - val_f1_score: 0.9505 - lr: 1.0000e-04
Epoch 9/16
348/348 [==============================] - 122s 351ms/step - loss: 0.0385 - categorical_accuracy: 0.9888 - f1_score: 0.9774 - val_loss: 0.0327 - val_categorical_accuracy: 0.9903 - val_f1_score: 0.9748 - lr: 1.0000e-05
Epoch 10/16
348/348 [==============================] - 122s 351ms/step - loss: 0.0217 - categorical_accuracy: 0.9946 - f1_score: 0.9906 - val_loss: 0.0279 - val_categorical_accuracy: 0.9944 - val_f1_score: 0.9877 - lr: 1.0000e-05
Epoch 11/16
348/348 [==============================] - 122s 351ms/step - loss: 0.0156 - categorical_accuracy: 0.9958 - f1_score: 0.9927 - val_loss: 0.0254 - val_categorical_accuracy: 0.9939 - val_f1_score: 0.9860 - lr: 1.0000e-05
Epoch 12/16
348/348 [==============================] - 122s 350ms/step - loss: 0.0279 - categorical_accuracy: 0.9930 - f1_score: 0.9897 - val_loss: 0.0380 - val_categorical_accuracy: 0.9903 - val_f1_score: 0.9781 - lr: 1.0000e-05
Epoch 13/16
348/348 [==============================] - ETA: 0s - loss: 0.0163 - categorical_accuracy: 0.9957 - f1_score: 0.9921
Epoch 13: ReduceLROnPlateau reducing learning rate to 9.999999747378752e-07.
348/348 [==============================] - 122s 351ms/step - loss: 0.0163 - categorical_accuracy: 0.9957 - f1_score: 0.9921 - val_loss: 0.0307 - val_categorical_accuracy: 0.9908 - val_f1_score: 0.9795 - lr: 1.0000e-05
Epoch 14/16
348/348 [==============================] - 122s 351ms/step - loss: 0.0122 - categorical_accuracy: 0.9965 - f1_score: 0.9948 - val_loss: 0.0283 - val_categorical_accuracy: 0.9928 - val_f1_score: 0.9832 - lr: 1.0000e-06
Epoch 15/16
348/348 [==============================] - ETA: 0s - loss: 0.0166 - categorical_accuracy: 0.9967 - f1_score: 0.9937
Epoch 15: ReduceLROnPlateau reducing learning rate to 9.999999974752428e-08.
348/348 [==============================] - 122s 350ms/step - loss: 0.0166 - categorical_accuracy: 0.9967 - f1_score: 0.9937 - val_loss: 0.0286 - val_categorical_accuracy: 0.9928 - val_f1_score: 0.9832 - lr: 1.0000e-06
Epoch 16/16
348/348 [==============================] - 122s 350ms/step - loss: 0.0118 - categorical_accuracy: 0.9970 - f1_score: 0.9951 - val_loss: 0.0281 - val_categorical_accuracy: 0.9928 - val_f1_score: 0.9832 - lr: 1.0000e-07
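The log shows `ReduceLROnPlateau` cutting the learning rate by 10x whenever `val_loss` stops improving. A pure-Python sketch of that rule (factor=0.1 and patience=2 are assumptions; the actual callback's settings aren't shown here):

```python
def reduce_on_plateau(val_losses, lr=1e-4, factor=0.1, patience=2):
    """Cut lr by `factor` after `patience` epochs with no val_loss improvement."""
    best, wait, lrs = float('inf'), 0, []
    for v in val_losses:
        if v < best:
            best, wait = v, 0       # improvement: reset the patience counter
        else:
            wait += 1
            if wait >= patience:    # plateau long enough: reduce and reset
                lr *= factor
                wait = 0
        lrs.append(lr)
    return lrs

# Two non-improving epochs in a row trigger a reduction on the last epoch:
reduce_on_plateau([1.0, 0.5, 0.6, 0.7])
```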
# Actual TGT classes distribution in validation set
# print("\n:::: Validation Set ====> ACTUAL TGT Classes Distribution ::::\n")
# display(val_tgt_classes_dist)
# Plotting the Results on Validation Set
print("\n:::: Validation Set ====> PREDICTION Confusion Matrix ::::\n")
resnet_50_global_tuning_val_results = confusion_matrix_(y_val, X_val, resnet_50)
# Displaying the overall performance results
print("\n:::: Validation Set ====> FINAL Results ::::\n")
display(resnet_50_global_tuning_val_results)
:::: Validation Set ====> PREDICTION Confusion Matrix ::::
62/62 [==============================] - 7s 103ms/step
:::: Validation Set ====> FINAL Results ::::
| | Healthy | Multiple_Diseases | Rust | Scab |
|---|---|---|---|---|
| BINARY Accuracy | 0.9964 | 0.9957 | 0.9964 | 0.9964 |
| Precision | 0.9931 | 0.9866 | 0.9917 | 0.9929 |
| Recall | 0.9948 | 0.9881 | 0.9924 | 0.9929 |
| Macro F1 Score | 0.9957 | 0.9723 | 0.9977 | 0.9960 |
| Macro ROC AUC Score | 0.9960 | 0.9723 | 0.9977 | 0.9958 |
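The `confusion_matrix_` helper's definition isn't shown here, but a one-vs-rest breakdown like the table above can be sketched with scikit-learn (the class names and example labels below are illustrative only):

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_score, recall_score

def per_class_metrics(y_true, y_pred, class_names):
    """One-vs-rest binary metrics for each class, one column per class."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    out = {}
    for idx, name in enumerate(class_names):
        t, p = (y_true == idx).astype(int), (y_pred == idx).astype(int)
        out[name] = {'BINARY Accuracy': accuracy_score(t, p),
                     'Precision': precision_score(t, p, zero_division=0),
                     'Recall': recall_score(t, p, zero_division=0)}
    return out

m = per_class_metrics([0, 1, 2, 3, 0, 2, 3, 1],
                      [0, 1, 2, 3, 0, 2, 1, 1],
                      ['Healthy', 'Multiple_Diseases', 'Rust', 'Scab'])
```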
OBSERVATIONS
# Actual TGT classes distribution in TEST set
# print("\n:::: TEST Set ====> ACTUAL TGT Classes Distribution ::::\n")
# display(val_tgt_classes_dist)
# Plotting the Results on Test Set
print("\n:::: TEST Set ====> PREDICTION Confusion Matrix ::::\n")
resnet_50_global_tuning_test_results = confusion_matrix_(y_test, X_test, resnet_50)
# Displaying the overall performance results
print("\n:::: TEST Set ====> FINAL Results ::::\n")
display(resnet_50_global_tuning_test_results)
:::: TEST Set ====> PREDICTION Confusion Matrix ::::
12/12 [==============================] - 2s 145ms/step
:::: TEST Set ====> FINAL Results ::::
| | Healthy | Multiple_Diseases | Rust | Scab |
|---|---|---|---|---|
| BINARY Accuracy | 0.7589 | 0.8479 | 0.8813 | 0.8644 |
| Precision | 0.5393 | 0.5250 | 0.6847 | 0.7288 |
| Recall | 1.0000 | 0.8678 | 0.8740 | 0.7288 |
| Macro F1 Score | 0.7494 | 0.5577 | 0.9409 | 0.7393 |
| Macro ROC AUC Score | 0.8321 | 0.5455 | 0.9317 | 0.7143 |
OBSERVATIONS
On the test set, the model performs best on Rust images and quite well on the Healthy & Scab TGT classes. For the Multiple_Diseases class, which has the fewest positive cases, it predicts the majority of them as negative.
curr_run_logdir1.split("/")[-1]
'run_2022_11_04-09_36_27'
notebook.list()
No known TensorBoard instances running.
%tensorboard --logdir logs

OBSERVATIONS
A2. DenseNet - 121¶
# build the DenseNet - 121 network
densenet_121_with_no_top_model = tf.keras.applications.DenseNet121(include_top=False, weights='imagenet', input_shape=(224,224,3))
# Model summary
densenet_121_with_no_top_model.summary()
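The summary below follows DenseNet-121's growth-rate pattern (k = 32): every block concatenates its 32-channel output onto its input, so channel counts climb by 32 per block until a transition layer halves them. A quick arithmetic check of the channel counts:

```python
def dense_block_channels(c_in: int, n_layers: int, growth_rate: int = 32) -> int:
    # Each layer appends `growth_rate` channels via concatenation
    return c_in + n_layers * growth_rate

assert dense_block_channels(64, 6) == 256    # dense block 2: pool1 (64 ch) + 6 layers
assert dense_block_channels(128, 12) == 512  # dense block 3: transition halves 256 -> 128
```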
Model: "densenet121"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_1 (InputLayer) [(None, 224, 224, 3 0 []
)]
zero_padding2d (ZeroPadding2D) (None, 230, 230, 3) 0 ['input_1[0][0]']
conv1/conv (Conv2D) (None, 112, 112, 64 9408 ['zero_padding2d[0][0]']
)
conv1/bn (BatchNormalization) (None, 112, 112, 64 256 ['conv1/conv[0][0]']
)
conv1/relu (Activation) (None, 112, 112, 64 0 ['conv1/bn[0][0]']
)
zero_padding2d_1 (ZeroPadding2 (None, 114, 114, 64 0 ['conv1/relu[0][0]']
D) )
pool1 (MaxPooling2D) (None, 56, 56, 64) 0 ['zero_padding2d_1[0][0]']
conv2_block1_0_bn (BatchNormal (None, 56, 56, 64) 256 ['pool1[0][0]']
ization)
conv2_block1_0_relu (Activatio (None, 56, 56, 64) 0 ['conv2_block1_0_bn[0][0]']
n)
conv2_block1_1_conv (Conv2D) (None, 56, 56, 128) 8192 ['conv2_block1_0_relu[0][0]']
conv2_block1_1_bn (BatchNormal (None, 56, 56, 128) 512 ['conv2_block1_1_conv[0][0]']
ization)
conv2_block1_1_relu (Activatio (None, 56, 56, 128) 0 ['conv2_block1_1_bn[0][0]']
n)
conv2_block1_2_conv (Conv2D) (None, 56, 56, 32) 36864 ['conv2_block1_1_relu[0][0]']
conv2_block1_concat (Concatena (None, 56, 56, 96) 0 ['pool1[0][0]',
te) 'conv2_block1_2_conv[0][0]']
conv2_block2_0_bn (BatchNormal (None, 56, 56, 96) 384 ['conv2_block1_concat[0][0]']
ization)
conv2_block2_0_relu (Activatio (None, 56, 56, 96) 0 ['conv2_block2_0_bn[0][0]']
n)
conv2_block2_1_conv (Conv2D) (None, 56, 56, 128) 12288 ['conv2_block2_0_relu[0][0]']
conv2_block2_1_bn (BatchNormal (None, 56, 56, 128) 512 ['conv2_block2_1_conv[0][0]']
ization)
conv2_block2_1_relu (Activatio (None, 56, 56, 128) 0 ['conv2_block2_1_bn[0][0]']
n)
conv2_block2_2_conv (Conv2D) (None, 56, 56, 32) 36864 ['conv2_block2_1_relu[0][0]']
conv2_block2_concat (Concatena (None, 56, 56, 128) 0 ['conv2_block1_concat[0][0]',
te) 'conv2_block2_2_conv[0][0]']
conv2_block3_0_bn (BatchNormal (None, 56, 56, 128) 512 ['conv2_block2_concat[0][0]']
ization)
conv2_block3_0_relu (Activatio (None, 56, 56, 128) 0 ['conv2_block3_0_bn[0][0]']
n)
conv2_block3_1_conv (Conv2D) (None, 56, 56, 128) 16384 ['conv2_block3_0_relu[0][0]']
conv2_block3_1_bn (BatchNormal (None, 56, 56, 128) 512 ['conv2_block3_1_conv[0][0]']
ization)
conv2_block3_1_relu (Activatio (None, 56, 56, 128) 0 ['conv2_block3_1_bn[0][0]']
n)
conv2_block3_2_conv (Conv2D) (None, 56, 56, 32) 36864 ['conv2_block3_1_relu[0][0]']
conv2_block3_concat (Concatena (None, 56, 56, 160) 0 ['conv2_block2_concat[0][0]',
te) 'conv2_block3_2_conv[0][0]']
conv2_block4_0_bn (BatchNormal (None, 56, 56, 160) 640 ['conv2_block3_concat[0][0]']
ization)
conv2_block4_0_relu (Activatio (None, 56, 56, 160) 0 ['conv2_block4_0_bn[0][0]']
n)
conv2_block4_1_conv (Conv2D) (None, 56, 56, 128) 20480 ['conv2_block4_0_relu[0][0]']
conv2_block4_1_bn (BatchNormal (None, 56, 56, 128) 512 ['conv2_block4_1_conv[0][0]']
ization)
conv2_block4_1_relu (Activatio (None, 56, 56, 128) 0 ['conv2_block4_1_bn[0][0]']
n)
conv2_block4_2_conv (Conv2D) (None, 56, 56, 32) 36864 ['conv2_block4_1_relu[0][0]']
conv2_block4_concat (Concatena (None, 56, 56, 192) 0 ['conv2_block3_concat[0][0]',
te) 'conv2_block4_2_conv[0][0]']
conv2_block5_0_bn (BatchNormal (None, 56, 56, 192) 768 ['conv2_block4_concat[0][0]']
ization)
conv2_block5_0_relu (Activatio (None, 56, 56, 192) 0 ['conv2_block5_0_bn[0][0]']
n)
conv2_block5_1_conv (Conv2D) (None, 56, 56, 128) 24576 ['conv2_block5_0_relu[0][0]']
conv2_block5_1_bn (BatchNormal (None, 56, 56, 128) 512 ['conv2_block5_1_conv[0][0]']
ization)
conv2_block5_1_relu (Activatio (None, 56, 56, 128) 0 ['conv2_block5_1_bn[0][0]']
n)
conv2_block5_2_conv (Conv2D) (None, 56, 56, 32) 36864 ['conv2_block5_1_relu[0][0]']
conv2_block5_concat (Concatena (None, 56, 56, 224) 0 ['conv2_block4_concat[0][0]',
te) 'conv2_block5_2_conv[0][0]']
conv2_block6_0_bn (BatchNormal (None, 56, 56, 224) 896 ['conv2_block5_concat[0][0]']
ization)
conv2_block6_0_relu (Activatio (None, 56, 56, 224) 0 ['conv2_block6_0_bn[0][0]']
n)
conv2_block6_1_conv (Conv2D) (None, 56, 56, 128) 28672 ['conv2_block6_0_relu[0][0]']
conv2_block6_1_bn (BatchNormal (None, 56, 56, 128) 512 ['conv2_block6_1_conv[0][0]']
ization)
conv2_block6_1_relu (Activatio (None, 56, 56, 128) 0 ['conv2_block6_1_bn[0][0]']
n)
conv2_block6_2_conv (Conv2D) (None, 56, 56, 32) 36864 ['conv2_block6_1_relu[0][0]']
conv2_block6_concat (Concatena (None, 56, 56, 256) 0 ['conv2_block5_concat[0][0]',
te) 'conv2_block6_2_conv[0][0]']
pool2_bn (BatchNormalization) (None, 56, 56, 256) 1024 ['conv2_block6_concat[0][0]']
pool2_relu (Activation) (None, 56, 56, 256) 0 ['pool2_bn[0][0]']
pool2_conv (Conv2D) (None, 56, 56, 128) 32768 ['pool2_relu[0][0]']
pool2_pool (AveragePooling2D) (None, 28, 28, 128) 0 ['pool2_conv[0][0]']
conv3_block1_0_bn (BatchNormal (None, 28, 28, 128) 512 ['pool2_pool[0][0]']
ization)
conv3_block1_0_relu (Activatio (None, 28, 28, 128) 0 ['conv3_block1_0_bn[0][0]']
n)
conv3_block1_1_conv (Conv2D) (None, 28, 28, 128) 16384 ['conv3_block1_0_relu[0][0]']
conv3_block1_1_bn (BatchNormal (None, 28, 28, 128) 512 ['conv3_block1_1_conv[0][0]']
ization)
conv3_block1_1_relu (Activatio (None, 28, 28, 128) 0 ['conv3_block1_1_bn[0][0]']
n)
conv3_block1_2_conv (Conv2D) (None, 28, 28, 32) 36864 ['conv3_block1_1_relu[0][0]']
conv3_block1_concat (Concatena (None, 28, 28, 160) 0 ['pool2_pool[0][0]',
te) 'conv3_block1_2_conv[0][0]']
conv3_block2_0_bn (BatchNormal (None, 28, 28, 160) 640 ['conv3_block1_concat[0][0]']
ization)
conv3_block2_0_relu (Activatio (None, 28, 28, 160) 0 ['conv3_block2_0_bn[0][0]']
n)
conv3_block2_1_conv (Conv2D) (None, 28, 28, 128) 20480 ['conv3_block2_0_relu[0][0]']
conv3_block2_1_bn (BatchNormal (None, 28, 28, 128) 512 ['conv3_block2_1_conv[0][0]']
ization)
conv3_block2_1_relu (Activatio (None, 28, 28, 128) 0 ['conv3_block2_1_bn[0][0]']
n)
conv3_block2_2_conv (Conv2D) (None, 28, 28, 32) 36864 ['conv3_block2_1_relu[0][0]']
conv3_block2_concat (Concatena (None, 28, 28, 192) 0 ['conv3_block1_concat[0][0]',
te) 'conv3_block2_2_conv[0][0]']
conv3_block3_0_bn (BatchNormal (None, 28, 28, 192) 768 ['conv3_block2_concat[0][0]']
ization)
conv3_block3_0_relu (Activatio (None, 28, 28, 192) 0 ['conv3_block3_0_bn[0][0]']
n)
conv3_block3_1_conv (Conv2D) (None, 28, 28, 128) 24576 ['conv3_block3_0_relu[0][0]']
conv3_block3_1_bn (BatchNormal (None, 28, 28, 128) 512 ['conv3_block3_1_conv[0][0]']
ization)
conv3_block3_1_relu (Activatio (None, 28, 28, 128) 0 ['conv3_block3_1_bn[0][0]']
n)
conv3_block3_2_conv (Conv2D) (None, 28, 28, 32) 36864 ['conv3_block3_1_relu[0][0]']
conv3_block3_concat (Concatena (None, 28, 28, 224) 0 ['conv3_block2_concat[0][0]',
te) 'conv3_block3_2_conv[0][0]']
conv3_block4_0_bn (BatchNormal (None, 28, 28, 224) 896 ['conv3_block3_concat[0][0]']
ization)
conv3_block4_0_relu (Activatio (None, 28, 28, 224) 0 ['conv3_block4_0_bn[0][0]']
n)
conv3_block4_1_conv (Conv2D) (None, 28, 28, 128) 28672 ['conv3_block4_0_relu[0][0]']
conv3_block4_1_bn (BatchNormal (None, 28, 28, 128) 512 ['conv3_block4_1_conv[0][0]']
ization)
conv3_block4_1_relu (Activatio (None, 28, 28, 128) 0 ['conv3_block4_1_bn[0][0]']
n)
conv3_block4_2_conv (Conv2D) (None, 28, 28, 32) 36864 ['conv3_block4_1_relu[0][0]']
conv3_block4_concat (Concatena (None, 28, 28, 256) 0 ['conv3_block3_concat[0][0]',
te) 'conv3_block4_2_conv[0][0]']
conv3_block5_0_bn (BatchNormal (None, 28, 28, 256) 1024 ['conv3_block4_concat[0][0]']
ization)
conv3_block5_0_relu (Activatio (None, 28, 28, 256) 0 ['conv3_block5_0_bn[0][0]']
n)
conv3_block5_1_conv (Conv2D) (None, 28, 28, 128) 32768 ['conv3_block5_0_relu[0][0]']
conv3_block5_1_bn (BatchNormal (None, 28, 28, 128) 512 ['conv3_block5_1_conv[0][0]']
ization)
conv3_block5_1_relu (Activatio (None, 28, 28, 128) 0 ['conv3_block5_1_bn[0][0]']
n)
conv3_block5_2_conv (Conv2D) (None, 28, 28, 32) 36864 ['conv3_block5_1_relu[0][0]']
conv3_block5_concat (Concatena (None, 28, 28, 288) 0 ['conv3_block4_concat[0][0]',
te) 'conv3_block5_2_conv[0][0]']
conv3_block6_0_bn (BatchNormal (None, 28, 28, 288) 1152 ['conv3_block5_concat[0][0]']
ization)
conv3_block6_0_relu (Activatio (None, 28, 28, 288) 0 ['conv3_block6_0_bn[0][0]']
n)
conv3_block6_1_conv (Conv2D) (None, 28, 28, 128) 36864 ['conv3_block6_0_relu[0][0]']
conv3_block6_1_bn (BatchNormal (None, 28, 28, 128) 512 ['conv3_block6_1_conv[0][0]']
ization)
conv3_block6_1_relu (Activatio (None, 28, 28, 128) 0 ['conv3_block6_1_bn[0][0]']
n)
conv3_block6_2_conv (Conv2D) (None, 28, 28, 32) 36864 ['conv3_block6_1_relu[0][0]']
conv3_block6_concat (Concatena (None, 28, 28, 320) 0 ['conv3_block5_concat[0][0]',
te) 'conv3_block6_2_conv[0][0]']
conv3_block7_0_bn (BatchNormal (None, 28, 28, 320) 1280 ['conv3_block6_concat[0][0]']
ization)
conv3_block7_0_relu (Activatio (None, 28, 28, 320) 0 ['conv3_block7_0_bn[0][0]']
n)
conv3_block7_1_conv (Conv2D) (None, 28, 28, 128) 40960 ['conv3_block7_0_relu[0][0]']
conv3_block7_1_bn (BatchNormal (None, 28, 28, 128) 512 ['conv3_block7_1_conv[0][0]']
ization)
conv3_block7_1_relu (Activatio (None, 28, 28, 128) 0 ['conv3_block7_1_bn[0][0]']
n)
conv3_block7_2_conv (Conv2D) (None, 28, 28, 32) 36864 ['conv3_block7_1_relu[0][0]']
conv3_block7_concat (Concatena (None, 28, 28, 352) 0 ['conv3_block6_concat[0][0]',
te) 'conv3_block7_2_conv[0][0]']
conv3_block8_0_bn (BatchNormal (None, 28, 28, 352) 1408 ['conv3_block7_concat[0][0]']
ization)
conv3_block8_0_relu (Activatio (None, 28, 28, 352) 0 ['conv3_block8_0_bn[0][0]']
n)
conv3_block8_1_conv (Conv2D) (None, 28, 28, 128) 45056 ['conv3_block8_0_relu[0][0]']
conv3_block8_1_bn (BatchNormal (None, 28, 28, 128) 512 ['conv3_block8_1_conv[0][0]']
ization)
conv3_block8_1_relu (Activatio (None, 28, 28, 128) 0 ['conv3_block8_1_bn[0][0]']
n)
conv3_block8_2_conv (Conv2D) (None, 28, 28, 32) 36864 ['conv3_block8_1_relu[0][0]']
conv3_block8_concat (Concatena (None, 28, 28, 384) 0 ['conv3_block7_concat[0][0]',
te) 'conv3_block8_2_conv[0][0]']
conv3_block9_0_bn (BatchNormal (None, 28, 28, 384) 1536 ['conv3_block8_concat[0][0]']
ization)
conv3_block9_0_relu (Activatio (None, 28, 28, 384) 0 ['conv3_block9_0_bn[0][0]']
n)
conv3_block9_1_conv (Conv2D) (None, 28, 28, 128) 49152 ['conv3_block9_0_relu[0][0]']
conv3_block9_1_bn (BatchNormal (None, 28, 28, 128) 512 ['conv3_block9_1_conv[0][0]']
ization)
conv3_block9_1_relu (Activatio (None, 28, 28, 128) 0 ['conv3_block9_1_bn[0][0]']
n)
conv3_block9_2_conv (Conv2D) (None, 28, 28, 32) 36864 ['conv3_block9_1_relu[0][0]']
conv3_block9_concat (Concatena (None, 28, 28, 416) 0 ['conv3_block8_concat[0][0]',
te) 'conv3_block9_2_conv[0][0]']
conv3_block10_0_bn (BatchNorma (None, 28, 28, 416) 1664 ['conv3_block9_concat[0][0]']
lization)
conv3_block10_0_relu (Activati (None, 28, 28, 416) 0 ['conv3_block10_0_bn[0][0]']
on)
conv3_block10_1_conv (Conv2D) (None, 28, 28, 128) 53248 ['conv3_block10_0_relu[0][0]']
conv3_block10_1_bn (BatchNorma (None, 28, 28, 128) 512 ['conv3_block10_1_conv[0][0]']
lization)
conv3_block10_1_relu (Activati (None, 28, 28, 128) 0 ['conv3_block10_1_bn[0][0]']
on)
conv3_block10_2_conv (Conv2D) (None, 28, 28, 32) 36864 ['conv3_block10_1_relu[0][0]']
conv3_block10_concat (Concaten (None, 28, 28, 448) 0 ['conv3_block9_concat[0][0]',
ate) 'conv3_block10_2_conv[0][0]']
conv3_block11_0_bn (BatchNorma (None, 28, 28, 448) 1792 ['conv3_block10_concat[0][0]']
lization)
conv3_block11_0_relu (Activati (None, 28, 28, 448) 0 ['conv3_block11_0_bn[0][0]']
on)
conv3_block11_1_conv (Conv2D) (None, 28, 28, 128) 57344 ['conv3_block11_0_relu[0][0]']
conv3_block11_1_bn (BatchNorma (None, 28, 28, 128) 512 ['conv3_block11_1_conv[0][0]']
lization)
conv3_block11_1_relu (Activati (None, 28, 28, 128) 0 ['conv3_block11_1_bn[0][0]']
on)
conv3_block11_2_conv (Conv2D) (None, 28, 28, 32) 36864 ['conv3_block11_1_relu[0][0]']
conv3_block11_concat (Concaten (None, 28, 28, 480) 0 ['conv3_block10_concat[0][0]',
ate) 'conv3_block11_2_conv[0][0]']
conv3_block12_0_bn (BatchNorma (None, 28, 28, 480) 1920 ['conv3_block11_concat[0][0]']
lization)
conv3_block12_0_relu (Activati (None, 28, 28, 480) 0 ['conv3_block12_0_bn[0][0]']
on)
conv3_block12_1_conv (Conv2D) (None, 28, 28, 128) 61440 ['conv3_block12_0_relu[0][0]']
conv3_block12_1_bn (BatchNorma (None, 28, 28, 128) 512 ['conv3_block12_1_conv[0][0]']
lization)
conv3_block12_1_relu (Activati (None, 28, 28, 128) 0 ['conv3_block12_1_bn[0][0]']
on)
conv3_block12_2_conv (Conv2D) (None, 28, 28, 32) 36864 ['conv3_block12_1_relu[0][0]']
conv3_block12_concat (Concaten (None, 28, 28, 512) 0 ['conv3_block11_concat[0][0]',
ate) 'conv3_block12_2_conv[0][0]']
pool3_bn (BatchNormalization) (None, 28, 28, 512) 2048 ['conv3_block12_concat[0][0]']
pool3_relu (Activation) (None, 28, 28, 512) 0 ['pool3_bn[0][0]']
pool3_conv (Conv2D) (None, 28, 28, 256) 131072 ['pool3_relu[0][0]']
pool3_pool (AveragePooling2D) (None, 14, 14, 256) 0 ['pool3_conv[0][0]']
conv4_block1_0_bn (BatchNormal (None, 14, 14, 256) 1024 ['pool3_pool[0][0]']
ization)
conv4_block1_0_relu (Activatio (None, 14, 14, 256) 0 ['conv4_block1_0_bn[0][0]']
n)
conv4_block1_1_conv (Conv2D) (None, 14, 14, 128) 32768 ['conv4_block1_0_relu[0][0]']
conv4_block1_1_bn (BatchNormal (None, 14, 14, 128) 512 ['conv4_block1_1_conv[0][0]']
ization)
conv4_block1_1_relu (Activatio (None, 14, 14, 128) 0 ['conv4_block1_1_bn[0][0]']
n)
conv4_block1_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block1_1_relu[0][0]']
conv4_block1_concat (Concatena (None, 14, 14, 288) 0 ['pool3_pool[0][0]',
te) 'conv4_block1_2_conv[0][0]']
conv4_block2_0_bn (BatchNormal (None, 14, 14, 288) 1152 ['conv4_block1_concat[0][0]']
ization)
conv4_block2_0_relu (Activatio (None, 14, 14, 288) 0 ['conv4_block2_0_bn[0][0]']
n)
conv4_block2_1_conv (Conv2D) (None, 14, 14, 128) 36864 ['conv4_block2_0_relu[0][0]']
conv4_block2_1_bn (BatchNormal (None, 14, 14, 128) 512 ['conv4_block2_1_conv[0][0]']
ization)
conv4_block2_1_relu (Activatio (None, 14, 14, 128) 0 ['conv4_block2_1_bn[0][0]']
n)
conv4_block2_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block2_1_relu[0][0]']
conv4_block2_concat (Concatena (None, 14, 14, 320) 0 ['conv4_block1_concat[0][0]',
te) 'conv4_block2_2_conv[0][0]']
conv4_block3_0_bn (BatchNormal (None, 14, 14, 320) 1280 ['conv4_block2_concat[0][0]']
ization)
conv4_block3_0_relu (Activatio (None, 14, 14, 320) 0 ['conv4_block3_0_bn[0][0]']
n)
conv4_block3_1_conv (Conv2D) (None, 14, 14, 128) 40960 ['conv4_block3_0_relu[0][0]']
conv4_block3_1_bn (BatchNormal (None, 14, 14, 128) 512 ['conv4_block3_1_conv[0][0]']
ization)
conv4_block3_1_relu (Activatio (None, 14, 14, 128) 0 ['conv4_block3_1_bn[0][0]']
n)
conv4_block3_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block3_1_relu[0][0]']
conv4_block3_concat (Concatena (None, 14, 14, 352) 0 ['conv4_block2_concat[0][0]',
te) 'conv4_block3_2_conv[0][0]']
conv4_block4_0_bn (BatchNormal (None, 14, 14, 352) 1408 ['conv4_block3_concat[0][0]']
ization)
conv4_block4_0_relu (Activatio (None, 14, 14, 352) 0 ['conv4_block4_0_bn[0][0]']
n)
conv4_block4_1_conv (Conv2D) (None, 14, 14, 128) 45056 ['conv4_block4_0_relu[0][0]']
conv4_block4_1_bn (BatchNormal (None, 14, 14, 128) 512 ['conv4_block4_1_conv[0][0]']
ization)
conv4_block4_1_relu (Activatio (None, 14, 14, 128) 0 ['conv4_block4_1_bn[0][0]']
n)
conv4_block4_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block4_1_relu[0][0]']
conv4_block4_concat (Concatena (None, 14, 14, 384) 0 ['conv4_block3_concat[0][0]',
te) 'conv4_block4_2_conv[0][0]']
conv4_block5_0_bn (BatchNormal (None, 14, 14, 384) 1536 ['conv4_block4_concat[0][0]']
ization)
conv4_block5_0_relu (Activatio (None, 14, 14, 384) 0 ['conv4_block5_0_bn[0][0]']
n)
conv4_block5_1_conv (Conv2D) (None, 14, 14, 128) 49152 ['conv4_block5_0_relu[0][0]']
conv4_block5_1_bn (BatchNormal (None, 14, 14, 128) 512 ['conv4_block5_1_conv[0][0]']
ization)
conv4_block5_1_relu (Activatio (None, 14, 14, 128) 0 ['conv4_block5_1_bn[0][0]']
n)
conv4_block5_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block5_1_relu[0][0]']
conv4_block5_concat (Concatena (None, 14, 14, 416) 0 ['conv4_block4_concat[0][0]',
te) 'conv4_block5_2_conv[0][0]']
conv4_block6_0_bn (BatchNormal (None, 14, 14, 416) 1664 ['conv4_block5_concat[0][0]']
ization)
conv4_block6_0_relu (Activatio (None, 14, 14, 416) 0 ['conv4_block6_0_bn[0][0]']
n)
conv4_block6_1_conv (Conv2D) (None, 14, 14, 128) 53248 ['conv4_block6_0_relu[0][0]']
conv4_block6_1_bn (BatchNormal (None, 14, 14, 128) 512 ['conv4_block6_1_conv[0][0]']
ization)
conv4_block6_1_relu (Activatio (None, 14, 14, 128) 0 ['conv4_block6_1_bn[0][0]']
n)
conv4_block6_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block6_1_relu[0][0]']
conv4_block6_concat (Concatena (None, 14, 14, 448) 0 ['conv4_block5_concat[0][0]',
te) 'conv4_block6_2_conv[0][0]']
conv4_block7_0_bn (BatchNormal (None, 14, 14, 448) 1792 ['conv4_block6_concat[0][0]']
ization)
conv4_block7_0_relu (Activatio (None, 14, 14, 448) 0 ['conv4_block7_0_bn[0][0]']
n)
conv4_block7_1_conv (Conv2D) (None, 14, 14, 128) 57344 ['conv4_block7_0_relu[0][0]']
conv4_block7_1_bn (BatchNormal (None, 14, 14, 128) 512 ['conv4_block7_1_conv[0][0]']
ization)
conv4_block7_1_relu (Activatio (None, 14, 14, 128) 0 ['conv4_block7_1_bn[0][0]']
n)
conv4_block7_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block7_1_relu[0][0]']
conv4_block7_concat (Concatena (None, 14, 14, 480) 0 ['conv4_block6_concat[0][0]',
te) 'conv4_block7_2_conv[0][0]']
conv4_block8_0_bn (BatchNormal (None, 14, 14, 480) 1920 ['conv4_block7_concat[0][0]']
ization)
conv4_block8_0_relu (Activatio (None, 14, 14, 480) 0 ['conv4_block8_0_bn[0][0]']
n)
conv4_block8_1_conv (Conv2D) (None, 14, 14, 128) 61440 ['conv4_block8_0_relu[0][0]']
conv4_block8_1_bn (BatchNormal (None, 14, 14, 128) 512 ['conv4_block8_1_conv[0][0]']
ization)
conv4_block8_1_relu (Activatio (None, 14, 14, 128) 0 ['conv4_block8_1_bn[0][0]']
n)
conv4_block8_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block8_1_relu[0][0]']
conv4_block8_concat (Concatena (None, 14, 14, 512) 0 ['conv4_block7_concat[0][0]',
te) 'conv4_block8_2_conv[0][0]']
conv4_block9_0_bn (BatchNormal (None, 14, 14, 512) 2048 ['conv4_block8_concat[0][0]']
ization)
conv4_block9_0_relu (Activatio (None, 14, 14, 512) 0 ['conv4_block9_0_bn[0][0]']
n)
conv4_block9_1_conv (Conv2D) (None, 14, 14, 128) 65536 ['conv4_block9_0_relu[0][0]']
conv4_block9_1_bn (BatchNormal (None, 14, 14, 128) 512 ['conv4_block9_1_conv[0][0]']
ization)
conv4_block9_1_relu (Activatio (None, 14, 14, 128) 0 ['conv4_block9_1_bn[0][0]']
n)
conv4_block9_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block9_1_relu[0][0]']
conv4_block9_concat (Concatena (None, 14, 14, 544) 0 ['conv4_block8_concat[0][0]',
te) 'conv4_block9_2_conv[0][0]']
conv4_block10_0_bn (BatchNorma (None, 14, 14, 544) 2176 ['conv4_block9_concat[0][0]']
lization)
conv4_block10_0_relu (Activati (None, 14, 14, 544) 0 ['conv4_block10_0_bn[0][0]']
on)
conv4_block10_1_conv (Conv2D) (None, 14, 14, 128) 69632 ['conv4_block10_0_relu[0][0]']
conv4_block10_1_bn (BatchNorma (None, 14, 14, 128) 512 ['conv4_block10_1_conv[0][0]']
lization)
conv4_block10_1_relu (Activati (None, 14, 14, 128) 0 ['conv4_block10_1_bn[0][0]']
on)
conv4_block10_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block10_1_relu[0][0]']
conv4_block10_concat (Concaten (None, 14, 14, 576) 0 ['conv4_block9_concat[0][0]',
ate) 'conv4_block10_2_conv[0][0]']
conv4_block11_0_bn (BatchNorma (None, 14, 14, 576) 2304 ['conv4_block10_concat[0][0]']
lization)
conv4_block11_0_relu (Activati (None, 14, 14, 576) 0 ['conv4_block11_0_bn[0][0]']
on)
conv4_block11_1_conv (Conv2D) (None, 14, 14, 128) 73728 ['conv4_block11_0_relu[0][0]']
conv4_block11_1_bn (BatchNorma (None, 14, 14, 128) 512 ['conv4_block11_1_conv[0][0]']
lization)
conv4_block11_1_relu (Activati (None, 14, 14, 128) 0 ['conv4_block11_1_bn[0][0]']
on)
conv4_block11_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block11_1_relu[0][0]']
conv4_block11_concat (Concaten (None, 14, 14, 608) 0 ['conv4_block10_concat[0][0]',
ate) 'conv4_block11_2_conv[0][0]']
conv4_block12_0_bn (BatchNorma (None, 14, 14, 608) 2432 ['conv4_block11_concat[0][0]']
lization)
conv4_block12_0_relu (Activati (None, 14, 14, 608) 0 ['conv4_block12_0_bn[0][0]']
on)
conv4_block12_1_conv (Conv2D) (None, 14, 14, 128) 77824 ['conv4_block12_0_relu[0][0]']
conv4_block12_1_bn (BatchNorma (None, 14, 14, 128) 512 ['conv4_block12_1_conv[0][0]']
lization)
conv4_block12_1_relu (Activati (None, 14, 14, 128) 0 ['conv4_block12_1_bn[0][0]']
on)
conv4_block12_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block12_1_relu[0][0]']
conv4_block12_concat (Concaten (None, 14, 14, 640) 0 ['conv4_block11_concat[0][0]',
ate) 'conv4_block12_2_conv[0][0]']
conv4_block13_0_bn (BatchNorma (None, 14, 14, 640) 2560 ['conv4_block12_concat[0][0]']
lization)
conv4_block13_0_relu (Activati (None, 14, 14, 640) 0 ['conv4_block13_0_bn[0][0]']
on)
conv4_block13_1_conv (Conv2D) (None, 14, 14, 128) 81920 ['conv4_block13_0_relu[0][0]']
conv4_block13_1_bn (BatchNorma (None, 14, 14, 128) 512 ['conv4_block13_1_conv[0][0]']
lization)
conv4_block13_1_relu (Activati (None, 14, 14, 128) 0 ['conv4_block13_1_bn[0][0]']
on)
conv4_block13_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block13_1_relu[0][0]']
conv4_block13_concat (Concaten (None, 14, 14, 672) 0 ['conv4_block12_concat[0][0]',
ate) 'conv4_block13_2_conv[0][0]']
conv4_block14_0_bn (BatchNorma (None, 14, 14, 672) 2688 ['conv4_block13_concat[0][0]']
lization)
conv4_block14_0_relu (Activati (None, 14, 14, 672) 0 ['conv4_block14_0_bn[0][0]']
on)
conv4_block14_1_conv (Conv2D) (None, 14, 14, 128) 86016 ['conv4_block14_0_relu[0][0]']
conv4_block14_1_bn (BatchNorma (None, 14, 14, 128) 512 ['conv4_block14_1_conv[0][0]']
lization)
conv4_block14_1_relu (Activati (None, 14, 14, 128) 0 ['conv4_block14_1_bn[0][0]']
on)
conv4_block14_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block14_1_relu[0][0]']
conv4_block14_concat (Concaten (None, 14, 14, 704) 0 ['conv4_block13_concat[0][0]',
ate) 'conv4_block14_2_conv[0][0]']
conv4_block15_0_bn (BatchNorma (None, 14, 14, 704) 2816 ['conv4_block14_concat[0][0]']
lization)
conv4_block15_0_relu (Activati (None, 14, 14, 704) 0 ['conv4_block15_0_bn[0][0]']
on)
conv4_block15_1_conv (Conv2D) (None, 14, 14, 128) 90112 ['conv4_block15_0_relu[0][0]']
conv4_block15_1_bn (BatchNorma (None, 14, 14, 128) 512 ['conv4_block15_1_conv[0][0]']
lization)
conv4_block15_1_relu (Activati (None, 14, 14, 128) 0 ['conv4_block15_1_bn[0][0]']
on)
conv4_block15_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block15_1_relu[0][0]']
conv4_block15_concat (Concaten (None, 14, 14, 736) 0 ['conv4_block14_concat[0][0]',
ate) 'conv4_block15_2_conv[0][0]']
conv4_block16_0_bn (BatchNorma (None, 14, 14, 736) 2944 ['conv4_block15_concat[0][0]']
lization)
conv4_block16_0_relu (Activati (None, 14, 14, 736) 0 ['conv4_block16_0_bn[0][0]']
on)
conv4_block16_1_conv (Conv2D) (None, 14, 14, 128) 94208 ['conv4_block16_0_relu[0][0]']
conv4_block16_1_bn (BatchNorma (None, 14, 14, 128) 512 ['conv4_block16_1_conv[0][0]']
lization)
conv4_block16_1_relu (Activati (None, 14, 14, 128) 0 ['conv4_block16_1_bn[0][0]']
on)
conv4_block16_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block16_1_relu[0][0]']
conv4_block16_concat (Concaten (None, 14, 14, 768) 0 ['conv4_block15_concat[0][0]',
ate) 'conv4_block16_2_conv[0][0]']
conv4_block17_0_bn (BatchNorma (None, 14, 14, 768) 3072 ['conv4_block16_concat[0][0]']
lization)
conv4_block17_0_relu (Activati (None, 14, 14, 768) 0 ['conv4_block17_0_bn[0][0]']
on)
conv4_block17_1_conv (Conv2D) (None, 14, 14, 128) 98304 ['conv4_block17_0_relu[0][0]']
conv4_block17_1_bn (BatchNorma (None, 14, 14, 128) 512 ['conv4_block17_1_conv[0][0]']
lization)
conv4_block17_1_relu (Activati (None, 14, 14, 128) 0 ['conv4_block17_1_bn[0][0]']
on)
conv4_block17_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block17_1_relu[0][0]']
conv4_block17_concat (Concaten (None, 14, 14, 800) 0 ['conv4_block16_concat[0][0]',
ate) 'conv4_block17_2_conv[0][0]']
conv4_block18_0_bn (BatchNorma (None, 14, 14, 800) 3200 ['conv4_block17_concat[0][0]']
lization)
conv4_block18_0_relu (Activati (None, 14, 14, 800) 0 ['conv4_block18_0_bn[0][0]']
on)
conv4_block18_1_conv (Conv2D) (None, 14, 14, 128) 102400 ['conv4_block18_0_relu[0][0]']
conv4_block18_1_bn (BatchNorma (None, 14, 14, 128) 512 ['conv4_block18_1_conv[0][0]']
lization)
conv4_block18_1_relu (Activati (None, 14, 14, 128) 0 ['conv4_block18_1_bn[0][0]']
on)
conv4_block18_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block18_1_relu[0][0]']
conv4_block18_concat (Concaten (None, 14, 14, 832) 0 ['conv4_block17_concat[0][0]',
ate) 'conv4_block18_2_conv[0][0]']
conv4_block19_0_bn (BatchNorma (None, 14, 14, 832) 3328 ['conv4_block18_concat[0][0]']
lization)
conv4_block19_0_relu (Activati (None, 14, 14, 832) 0 ['conv4_block19_0_bn[0][0]']
on)
conv4_block19_1_conv (Conv2D) (None, 14, 14, 128) 106496 ['conv4_block19_0_relu[0][0]']
conv4_block19_1_bn (BatchNorma (None, 14, 14, 128) 512 ['conv4_block19_1_conv[0][0]']
lization)
conv4_block19_1_relu (Activati (None, 14, 14, 128) 0 ['conv4_block19_1_bn[0][0]']
on)
conv4_block19_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block19_1_relu[0][0]']
conv4_block19_concat (Concaten (None, 14, 14, 864) 0 ['conv4_block18_concat[0][0]',
ate) 'conv4_block19_2_conv[0][0]']
conv4_block20_0_bn (BatchNorma (None, 14, 14, 864) 3456 ['conv4_block19_concat[0][0]']
lization)
conv4_block20_0_relu (Activati (None, 14, 14, 864) 0 ['conv4_block20_0_bn[0][0]']
on)
conv4_block20_1_conv (Conv2D) (None, 14, 14, 128) 110592 ['conv4_block20_0_relu[0][0]']
conv4_block20_1_bn (BatchNorma (None, 14, 14, 128) 512 ['conv4_block20_1_conv[0][0]']
lization)
conv4_block20_1_relu (Activati (None, 14, 14, 128) 0 ['conv4_block20_1_bn[0][0]']
on)
conv4_block20_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block20_1_relu[0][0]']
conv4_block20_concat (Concaten (None, 14, 14, 896) 0 ['conv4_block19_concat[0][0]',
ate) 'conv4_block20_2_conv[0][0]']
conv4_block21_0_bn (BatchNorma (None, 14, 14, 896) 3584 ['conv4_block20_concat[0][0]']
lization)
conv4_block21_0_relu (Activati (None, 14, 14, 896) 0 ['conv4_block21_0_bn[0][0]']
on)
conv4_block21_1_conv (Conv2D) (None, 14, 14, 128) 114688 ['conv4_block21_0_relu[0][0]']
conv4_block21_1_bn (BatchNorma (None, 14, 14, 128) 512 ['conv4_block21_1_conv[0][0]']
lization)
conv4_block21_1_relu (Activati (None, 14, 14, 128) 0 ['conv4_block21_1_bn[0][0]']
on)
conv4_block21_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block21_1_relu[0][0]']
conv4_block21_concat (Concaten (None, 14, 14, 928) 0 ['conv4_block20_concat[0][0]',
ate) 'conv4_block21_2_conv[0][0]']
conv4_block22_0_bn (BatchNorma (None, 14, 14, 928) 3712 ['conv4_block21_concat[0][0]']
lization)
conv4_block22_0_relu (Activati (None, 14, 14, 928) 0 ['conv4_block22_0_bn[0][0]']
on)
conv4_block22_1_conv (Conv2D) (None, 14, 14, 128) 118784 ['conv4_block22_0_relu[0][0]']
conv4_block22_1_bn (BatchNorma (None, 14, 14, 128) 512 ['conv4_block22_1_conv[0][0]']
lization)
conv4_block22_1_relu (Activati (None, 14, 14, 128) 0 ['conv4_block22_1_bn[0][0]']
on)
conv4_block22_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block22_1_relu[0][0]']
conv4_block22_concat (Concaten (None, 14, 14, 960) 0 ['conv4_block21_concat[0][0]',
ate) 'conv4_block22_2_conv[0][0]']
conv4_block23_0_bn (BatchNorma (None, 14, 14, 960) 3840 ['conv4_block22_concat[0][0]']
lization)
conv4_block23_0_relu (Activati (None, 14, 14, 960) 0 ['conv4_block23_0_bn[0][0]']
on)
conv4_block23_1_conv (Conv2D) (None, 14, 14, 128) 122880 ['conv4_block23_0_relu[0][0]']
conv4_block23_1_bn (BatchNorma (None, 14, 14, 128) 512 ['conv4_block23_1_conv[0][0]']
lization)
conv4_block23_1_relu (Activati (None, 14, 14, 128) 0 ['conv4_block23_1_bn[0][0]']
on)
conv4_block23_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block23_1_relu[0][0]']
conv4_block23_concat (Concaten (None, 14, 14, 992) 0 ['conv4_block22_concat[0][0]',
ate) 'conv4_block23_2_conv[0][0]']
conv4_block24_0_bn (BatchNorma (None, 14, 14, 992) 3968 ['conv4_block23_concat[0][0]']
lization)
conv4_block24_0_relu (Activati (None, 14, 14, 992) 0 ['conv4_block24_0_bn[0][0]']
on)
conv4_block24_1_conv (Conv2D) (None, 14, 14, 128) 126976 ['conv4_block24_0_relu[0][0]']
conv4_block24_1_bn (BatchNorma (None, 14, 14, 128) 512 ['conv4_block24_1_conv[0][0]']
lization)
conv4_block24_1_relu (Activati (None, 14, 14, 128) 0 ['conv4_block24_1_bn[0][0]']
on)
conv4_block24_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block24_1_relu[0][0]']
conv4_block24_concat (Concaten (None, 14, 14, 1024 0 ['conv4_block23_concat[0][0]',
ate) ) 'conv4_block24_2_conv[0][0]']
pool4_bn (BatchNormalization) (None, 14, 14, 1024 4096 ['conv4_block24_concat[0][0]']
)
pool4_relu (Activation) (None, 14, 14, 1024 0 ['pool4_bn[0][0]']
)
pool4_conv (Conv2D) (None, 14, 14, 512) 524288 ['pool4_relu[0][0]']
pool4_pool (AveragePooling2D) (None, 7, 7, 512) 0 ['pool4_conv[0][0]']
conv5_block1_0_bn (BatchNormal (None, 7, 7, 512) 2048 ['pool4_pool[0][0]']
ization)
conv5_block1_0_relu (Activatio (None, 7, 7, 512) 0 ['conv5_block1_0_bn[0][0]']
n)
conv5_block1_1_conv (Conv2D) (None, 7, 7, 128) 65536 ['conv5_block1_0_relu[0][0]']
conv5_block1_1_bn (BatchNormal (None, 7, 7, 128) 512 ['conv5_block1_1_conv[0][0]']
ization)
conv5_block1_1_relu (Activatio (None, 7, 7, 128) 0 ['conv5_block1_1_bn[0][0]']
n)
conv5_block1_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block1_1_relu[0][0]']
conv5_block1_concat (Concatena (None, 7, 7, 544) 0 ['pool4_pool[0][0]',
te) 'conv5_block1_2_conv[0][0]']
conv5_block2_0_bn (BatchNormal (None, 7, 7, 544) 2176 ['conv5_block1_concat[0][0]']
ization)
conv5_block2_0_relu (Activatio (None, 7, 7, 544) 0 ['conv5_block2_0_bn[0][0]']
n)
conv5_block2_1_conv (Conv2D) (None, 7, 7, 128) 69632 ['conv5_block2_0_relu[0][0]']
conv5_block2_1_bn (BatchNormal (None, 7, 7, 128) 512 ['conv5_block2_1_conv[0][0]']
ization)
conv5_block2_1_relu (Activatio (None, 7, 7, 128) 0 ['conv5_block2_1_bn[0][0]']
n)
conv5_block2_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block2_1_relu[0][0]']
conv5_block2_concat (Concatena (None, 7, 7, 576) 0 ['conv5_block1_concat[0][0]',
te) 'conv5_block2_2_conv[0][0]']
conv5_block3_0_bn (BatchNormal (None, 7, 7, 576) 2304 ['conv5_block2_concat[0][0]']
ization)
conv5_block3_0_relu (Activatio (None, 7, 7, 576) 0 ['conv5_block3_0_bn[0][0]']
n)
conv5_block3_1_conv (Conv2D) (None, 7, 7, 128) 73728 ['conv5_block3_0_relu[0][0]']
conv5_block3_1_bn (BatchNormal (None, 7, 7, 128) 512 ['conv5_block3_1_conv[0][0]']
ization)
conv5_block3_1_relu (Activatio (None, 7, 7, 128) 0 ['conv5_block3_1_bn[0][0]']
n)
conv5_block3_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block3_1_relu[0][0]']
conv5_block3_concat (Concatena (None, 7, 7, 608) 0 ['conv5_block2_concat[0][0]',
te) 'conv5_block3_2_conv[0][0]']
conv5_block4_0_bn (BatchNormal (None, 7, 7, 608) 2432 ['conv5_block3_concat[0][0]']
ization)
conv5_block4_0_relu (Activatio (None, 7, 7, 608) 0 ['conv5_block4_0_bn[0][0]']
n)
conv5_block4_1_conv (Conv2D) (None, 7, 7, 128) 77824 ['conv5_block4_0_relu[0][0]']
conv5_block4_1_bn (BatchNormal (None, 7, 7, 128) 512 ['conv5_block4_1_conv[0][0]']
ization)
conv5_block4_1_relu (Activatio (None, 7, 7, 128) 0 ['conv5_block4_1_bn[0][0]']
n)
conv5_block4_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block4_1_relu[0][0]']
conv5_block4_concat (Concatena (None, 7, 7, 640) 0 ['conv5_block3_concat[0][0]',
te) 'conv5_block4_2_conv[0][0]']
conv5_block5_0_bn (BatchNormal (None, 7, 7, 640) 2560 ['conv5_block4_concat[0][0]']
ization)
conv5_block5_0_relu (Activatio (None, 7, 7, 640) 0 ['conv5_block5_0_bn[0][0]']
n)
conv5_block5_1_conv (Conv2D) (None, 7, 7, 128) 81920 ['conv5_block5_0_relu[0][0]']
conv5_block5_1_bn (BatchNormal (None, 7, 7, 128) 512 ['conv5_block5_1_conv[0][0]']
ization)
conv5_block5_1_relu (Activatio (None, 7, 7, 128) 0 ['conv5_block5_1_bn[0][0]']
n)
conv5_block5_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block5_1_relu[0][0]']
conv5_block5_concat (Concatena (None, 7, 7, 672) 0 ['conv5_block4_concat[0][0]',
te) 'conv5_block5_2_conv[0][0]']
conv5_block6_0_bn (BatchNormal (None, 7, 7, 672) 2688 ['conv5_block5_concat[0][0]']
ization)
conv5_block6_0_relu (Activatio (None, 7, 7, 672) 0 ['conv5_block6_0_bn[0][0]']
n)
conv5_block6_1_conv (Conv2D) (None, 7, 7, 128) 86016 ['conv5_block6_0_relu[0][0]']
conv5_block6_1_bn (BatchNormal (None, 7, 7, 128) 512 ['conv5_block6_1_conv[0][0]']
ization)
conv5_block6_1_relu (Activatio (None, 7, 7, 128) 0 ['conv5_block6_1_bn[0][0]']
n)
conv5_block6_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block6_1_relu[0][0]']
conv5_block6_concat (Concatena (None, 7, 7, 704) 0 ['conv5_block5_concat[0][0]',
te) 'conv5_block6_2_conv[0][0]']
conv5_block7_0_bn (BatchNormal (None, 7, 7, 704) 2816 ['conv5_block6_concat[0][0]']
ization)
conv5_block7_0_relu (Activatio (None, 7, 7, 704) 0 ['conv5_block7_0_bn[0][0]']
n)
conv5_block7_1_conv (Conv2D) (None, 7, 7, 128) 90112 ['conv5_block7_0_relu[0][0]']
conv5_block7_1_bn (BatchNormal (None, 7, 7, 128) 512 ['conv5_block7_1_conv[0][0]']
ization)
conv5_block7_1_relu (Activatio (None, 7, 7, 128) 0 ['conv5_block7_1_bn[0][0]']
n)
conv5_block7_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block7_1_relu[0][0]']
conv5_block7_concat (Concatena (None, 7, 7, 736) 0 ['conv5_block6_concat[0][0]',
te) 'conv5_block7_2_conv[0][0]']
conv5_block8_0_bn (BatchNormal (None, 7, 7, 736) 2944 ['conv5_block7_concat[0][0]']
ization)
conv5_block8_0_relu (Activatio (None, 7, 7, 736) 0 ['conv5_block8_0_bn[0][0]']
n)
conv5_block8_1_conv (Conv2D) (None, 7, 7, 128) 94208 ['conv5_block8_0_relu[0][0]']
conv5_block8_1_bn (BatchNormal (None, 7, 7, 128) 512 ['conv5_block8_1_conv[0][0]']
ization)
conv5_block8_1_relu (Activatio (None, 7, 7, 128) 0 ['conv5_block8_1_bn[0][0]']
n)
conv5_block8_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block8_1_relu[0][0]']
conv5_block8_concat (Concatena (None, 7, 7, 768) 0 ['conv5_block7_concat[0][0]',
te) 'conv5_block8_2_conv[0][0]']
conv5_block9_0_bn (BatchNormal (None, 7, 7, 768) 3072 ['conv5_block8_concat[0][0]']
ization)
conv5_block9_0_relu (Activatio (None, 7, 7, 768) 0 ['conv5_block9_0_bn[0][0]']
n)
conv5_block9_1_conv (Conv2D) (None, 7, 7, 128) 98304 ['conv5_block9_0_relu[0][0]']
conv5_block9_1_bn (BatchNormal (None, 7, 7, 128) 512 ['conv5_block9_1_conv[0][0]']
ization)
conv5_block9_1_relu (Activatio (None, 7, 7, 128) 0 ['conv5_block9_1_bn[0][0]']
n)
conv5_block9_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block9_1_relu[0][0]']
conv5_block9_concat (Concatena (None, 7, 7, 800) 0 ['conv5_block8_concat[0][0]',
te) 'conv5_block9_2_conv[0][0]']
conv5_block10_0_bn (BatchNorma (None, 7, 7, 800) 3200 ['conv5_block9_concat[0][0]']
lization)
conv5_block10_0_relu (Activati (None, 7, 7, 800) 0 ['conv5_block10_0_bn[0][0]']
on)
conv5_block10_1_conv (Conv2D) (None, 7, 7, 128) 102400 ['conv5_block10_0_relu[0][0]']
conv5_block10_1_bn (BatchNorma (None, 7, 7, 128) 512 ['conv5_block10_1_conv[0][0]']
lization)
conv5_block10_1_relu (Activati (None, 7, 7, 128) 0 ['conv5_block10_1_bn[0][0]']
on)
conv5_block10_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block10_1_relu[0][0]']
conv5_block10_concat (Concaten (None, 7, 7, 832) 0 ['conv5_block9_concat[0][0]',
ate) 'conv5_block10_2_conv[0][0]']
conv5_block11_0_bn (BatchNorma (None, 7, 7, 832) 3328 ['conv5_block10_concat[0][0]']
lization)
conv5_block11_0_relu (Activati (None, 7, 7, 832) 0 ['conv5_block11_0_bn[0][0]']
on)
conv5_block11_1_conv (Conv2D) (None, 7, 7, 128) 106496 ['conv5_block11_0_relu[0][0]']
conv5_block11_1_bn (BatchNorma (None, 7, 7, 128) 512 ['conv5_block11_1_conv[0][0]']
lization)
conv5_block11_1_relu (Activati (None, 7, 7, 128) 0 ['conv5_block11_1_bn[0][0]']
on)
conv5_block11_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block11_1_relu[0][0]']
conv5_block11_concat (Concaten (None, 7, 7, 864) 0 ['conv5_block10_concat[0][0]',
ate) 'conv5_block11_2_conv[0][0]']
conv5_block12_0_bn (BatchNorma (None, 7, 7, 864) 3456 ['conv5_block11_concat[0][0]']
lization)
conv5_block12_0_relu (Activati (None, 7, 7, 864) 0 ['conv5_block12_0_bn[0][0]']
on)
conv5_block12_1_conv (Conv2D) (None, 7, 7, 128) 110592 ['conv5_block12_0_relu[0][0]']
conv5_block12_1_bn (BatchNorma (None, 7, 7, 128) 512 ['conv5_block12_1_conv[0][0]']
lization)
conv5_block12_1_relu (Activati (None, 7, 7, 128) 0 ['conv5_block12_1_bn[0][0]']
on)
conv5_block12_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block12_1_relu[0][0]']
conv5_block12_concat (Concaten (None, 7, 7, 896) 0 ['conv5_block11_concat[0][0]',
ate) 'conv5_block12_2_conv[0][0]']
conv5_block13_0_bn (BatchNorma (None, 7, 7, 896) 3584 ['conv5_block12_concat[0][0]']
lization)
conv5_block13_0_relu (Activati (None, 7, 7, 896) 0 ['conv5_block13_0_bn[0][0]']
on)
conv5_block13_1_conv (Conv2D) (None, 7, 7, 128) 114688 ['conv5_block13_0_relu[0][0]']
conv5_block13_1_bn (BatchNorma (None, 7, 7, 128) 512 ['conv5_block13_1_conv[0][0]']
lization)
conv5_block13_1_relu (Activati (None, 7, 7, 128) 0 ['conv5_block13_1_bn[0][0]']
on)
conv5_block13_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block13_1_relu[0][0]']
conv5_block13_concat (Concaten (None, 7, 7, 928) 0 ['conv5_block12_concat[0][0]',
ate) 'conv5_block13_2_conv[0][0]']
conv5_block14_0_bn (BatchNorma (None, 7, 7, 928) 3712 ['conv5_block13_concat[0][0]']
lization)
conv5_block14_0_relu (Activati (None, 7, 7, 928) 0 ['conv5_block14_0_bn[0][0]']
on)
conv5_block14_1_conv (Conv2D) (None, 7, 7, 128) 118784 ['conv5_block14_0_relu[0][0]']
conv5_block14_1_bn (BatchNorma (None, 7, 7, 128) 512 ['conv5_block14_1_conv[0][0]']
lization)
conv5_block14_1_relu (Activati (None, 7, 7, 128) 0 ['conv5_block14_1_bn[0][0]']
on)
conv5_block14_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block14_1_relu[0][0]']
conv5_block14_concat (Concaten (None, 7, 7, 960) 0 ['conv5_block13_concat[0][0]',
ate) 'conv5_block14_2_conv[0][0]']
conv5_block15_0_bn (BatchNorma (None, 7, 7, 960) 3840 ['conv5_block14_concat[0][0]']
lization)
conv5_block15_0_relu (Activati (None, 7, 7, 960) 0 ['conv5_block15_0_bn[0][0]']
on)
conv5_block15_1_conv (Conv2D) (None, 7, 7, 128) 122880 ['conv5_block15_0_relu[0][0]']
conv5_block15_1_bn (BatchNorma (None, 7, 7, 128) 512 ['conv5_block15_1_conv[0][0]']
lization)
conv5_block15_1_relu (Activati (None, 7, 7, 128) 0 ['conv5_block15_1_bn[0][0]']
on)
conv5_block15_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block15_1_relu[0][0]']
conv5_block15_concat (Concaten (None, 7, 7, 992) 0 ['conv5_block14_concat[0][0]',
ate) 'conv5_block15_2_conv[0][0]']
conv5_block16_0_bn (BatchNorma (None, 7, 7, 992) 3968 ['conv5_block15_concat[0][0]']
lization)
conv5_block16_0_relu (Activati (None, 7, 7, 992) 0 ['conv5_block16_0_bn[0][0]']
on)
conv5_block16_1_conv (Conv2D) (None, 7, 7, 128) 126976 ['conv5_block16_0_relu[0][0]']
conv5_block16_1_bn (BatchNorma (None, 7, 7, 128) 512 ['conv5_block16_1_conv[0][0]']
lization)
conv5_block16_1_relu (Activati (None, 7, 7, 128) 0 ['conv5_block16_1_bn[0][0]']
on)
conv5_block16_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block16_1_relu[0][0]']
conv5_block16_concat (Concaten (None, 7, 7, 1024) 0 ['conv5_block15_concat[0][0]',
ate) 'conv5_block16_2_conv[0][0]']
bn (BatchNormalization) (None, 7, 7, 1024) 4096 ['conv5_block16_concat[0][0]']
relu (Activation) (None, 7, 7, 1024) 0 ['bn[0][0]']
==================================================================================================
Total params: 7,037,504
Trainable params: 6,953,856
Non-trainable params: 83,648
__________________________________________________________________________________________________
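The compile step below passes `tfa_f1_scr` as a metric; this is presumably a `tensorflow_addons` F1 metric (e.g. `tfa.metrics.F1Score`) instantiated earlier in the notebook and not shown in this section. As a hedged, framework-free sketch of what a macro-averaged F1 over the 4 classes computes (the function name `macro_f1` is illustrative, not from the notebook):

```python
import numpy as np

def macro_f1(y_true, y_pred, num_classes=4):
    """Macro-averaged F1: compute per-class F1, then take the unweighted mean.

    y_true / y_pred are 1-D arrays of integer class labels.
    """
    scores = []
    for c in range(num_classes):
        tp = np.sum((y_pred == c) & (y_true == c))  # correctly predicted as c
        fp = np.sum((y_pred == c) & (y_true != c))  # predicted c, actually other
        fn = np.sum((y_pred != c) & (y_true == c))  # actually c, predicted other
        precision = tp / (tp + fp) if (tp + fp) else 0.0
        recall = tp / (tp + fn) if (tp + fn) else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if (precision + recall) else 0.0)
        scores.append(f1)
    return float(np.mean(scores))
```

Macro averaging weights each class equally regardless of support, which is why it is a common choice for imbalanced pathology classes.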
# Instantiating Optimizer
learning_rate = 0.0001
opt2 = tf.keras.optimizers.Adam(learning_rate=learning_rate)
# Reduce Learning Rate on Plateau
reduce_lr2 = tf.keras.callbacks.ReduceLROnPlateau(monitor='val_loss', factor=0.1, patience=2, verbose=1, mode='auto', min_delta=0.0001)
# Logs directory
curr_run_logdir2 = get_run_logdir()
# Instantiating Tensorboard callback
tensorboard_callback2 = TensorBoard(log_dir=curr_run_logdir2, histogram_freq=1)
# Setting the seed
os.environ['PYTHONHASHSEED'] = '0'
# Clearing the TF session
tf.keras.backend.clear_session()
# Unfreezing all layers of the DenseNet-121 base model for fine-tuning
for layer in densenet_121_with_no_top_model.layers:
    layer.trainable = True
# Taking the base model's output as the input to the custom top
input_layer = densenet_121_with_no_top_model.output
# Defining the top layers structure of the model
flatten = tf.keras.layers.GlobalAveragePooling2D(name='Flatten_for_hidden_layers')(input_layer)
dropout_1 = Dropout(rate=0.5, name='Dropout1')(flatten)
dense_layer1 = tf.keras.layers.Dense(units=128,
activation='relu',
use_bias=True,
kernel_initializer=tf.keras.initializers.he_normal(seed=80),
bias_initializer=tf.keras.initializers.he_normal(seed=110),
name='Hidden_Layer1')(dropout_1)
dropout_2 = Dropout(rate=0.5, name='Dropout2')(dense_layer1)
dense_layer2 = tf.keras.layers.Dense(units=128,
activation='relu',
use_bias=True,
kernel_initializer=tf.keras.initializers.he_normal(seed=80),
bias_initializer=tf.keras.initializers.he_normal(seed=110),
name='Hidden_Layer2')(dropout_2)
dropout_3 = Dropout(rate=0.5, name='Dropout3')(dense_layer2)
dense_layer3 = tf.keras.layers.Dense(units=128,
activation='relu',
use_bias=True,
kernel_initializer=tf.keras.initializers.he_normal(seed=80),
bias_initializer=tf.keras.initializers.he_normal(seed=110),
name='Hidden_Layer3')(dropout_3)
output_layer = tf.keras.layers.Dense(4, activation='softmax', name="output")(dense_layer3)
# Instantiating the complete model
densenet_121 = Model(inputs=densenet_121_with_no_top_model.input, outputs=output_layer)
# Compiling the model
densenet_121.compile(optimizer=opt2,
loss='categorical_crossentropy',
metrics=['categorical_accuracy', tfa_f1_scr])
# Summary of the DenseNet - 121 with custom top
densenet_121.summary()
Model: "model"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_1 (InputLayer) [(None, 224, 224, 3 0 []
)]
zero_padding2d (ZeroPadding2D) (None, 230, 230, 3) 0 ['input_1[0][0]']
conv1/conv (Conv2D) (None, 112, 112, 64 9408 ['zero_padding2d[0][0]']
)
conv1/bn (BatchNormalization) (None, 112, 112, 64 256 ['conv1/conv[0][0]']
)
conv1/relu (Activation) (None, 112, 112, 64 0 ['conv1/bn[0][0]']
)
zero_padding2d_1 (ZeroPadding2 (None, 114, 114, 64 0 ['conv1/relu[0][0]']
D) )
pool1 (MaxPooling2D) (None, 56, 56, 64) 0 ['zero_padding2d_1[0][0]']
conv2_block1_0_bn (BatchNormal (None, 56, 56, 64) 256 ['pool1[0][0]']
ization)
conv2_block1_0_relu (Activatio (None, 56, 56, 64) 0 ['conv2_block1_0_bn[0][0]']
n)
conv2_block1_1_conv (Conv2D) (None, 56, 56, 128) 8192 ['conv2_block1_0_relu[0][0]']
conv2_block1_1_bn (BatchNormal (None, 56, 56, 128) 512 ['conv2_block1_1_conv[0][0]']
ization)
conv2_block1_1_relu (Activatio (None, 56, 56, 128) 0 ['conv2_block1_1_bn[0][0]']
n)
conv2_block1_2_conv (Conv2D) (None, 56, 56, 32) 36864 ['conv2_block1_1_relu[0][0]']
conv2_block1_concat (Concatena (None, 56, 56, 96) 0 ['pool1[0][0]',
te) 'conv2_block1_2_conv[0][0]']
conv2_block2_0_bn (BatchNormal (None, 56, 56, 96) 384 ['conv2_block1_concat[0][0]']
ization)
conv2_block2_0_relu (Activatio (None, 56, 56, 96) 0 ['conv2_block2_0_bn[0][0]']
n)
conv2_block2_1_conv (Conv2D) (None, 56, 56, 128) 12288 ['conv2_block2_0_relu[0][0]']
conv2_block2_1_bn (BatchNormal (None, 56, 56, 128) 512 ['conv2_block2_1_conv[0][0]']
ization)
conv2_block2_1_relu (Activatio (None, 56, 56, 128) 0 ['conv2_block2_1_bn[0][0]']
n)
conv2_block2_2_conv (Conv2D) (None, 56, 56, 32) 36864 ['conv2_block2_1_relu[0][0]']
conv2_block2_concat (Concatena (None, 56, 56, 128) 0 ['conv2_block1_concat[0][0]',
te) 'conv2_block2_2_conv[0][0]']
conv2_block3_0_bn (BatchNormal (None, 56, 56, 128) 512 ['conv2_block2_concat[0][0]']
ization)
conv2_block3_0_relu (Activatio (None, 56, 56, 128) 0 ['conv2_block3_0_bn[0][0]']
n)
conv2_block3_1_conv (Conv2D) (None, 56, 56, 128) 16384 ['conv2_block3_0_relu[0][0]']
conv2_block3_1_bn (BatchNormal (None, 56, 56, 128) 512 ['conv2_block3_1_conv[0][0]']
ization)
conv2_block3_1_relu (Activatio (None, 56, 56, 128) 0 ['conv2_block3_1_bn[0][0]']
n)
conv2_block3_2_conv (Conv2D) (None, 56, 56, 32) 36864 ['conv2_block3_1_relu[0][0]']
conv2_block3_concat (Concatena (None, 56, 56, 160) 0 ['conv2_block2_concat[0][0]',
te) 'conv2_block3_2_conv[0][0]']
conv2_block4_0_bn (BatchNormal (None, 56, 56, 160) 640 ['conv2_block3_concat[0][0]']
ization)
conv2_block4_0_relu (Activatio (None, 56, 56, 160) 0 ['conv2_block4_0_bn[0][0]']
n)
conv2_block4_1_conv (Conv2D) (None, 56, 56, 128) 20480 ['conv2_block4_0_relu[0][0]']
conv2_block4_1_bn (BatchNormal (None, 56, 56, 128) 512 ['conv2_block4_1_conv[0][0]']
ization)
conv2_block4_1_relu (Activatio (None, 56, 56, 128) 0 ['conv2_block4_1_bn[0][0]']
n)
conv2_block4_2_conv (Conv2D) (None, 56, 56, 32) 36864 ['conv2_block4_1_relu[0][0]']
conv2_block4_concat (Concatena (None, 56, 56, 192) 0 ['conv2_block3_concat[0][0]',
te) 'conv2_block4_2_conv[0][0]']
conv2_block5_0_bn (BatchNormal (None, 56, 56, 192) 768 ['conv2_block4_concat[0][0]']
ization)
conv2_block5_0_relu (Activatio (None, 56, 56, 192) 0 ['conv2_block5_0_bn[0][0]']
n)
conv2_block5_1_conv (Conv2D) (None, 56, 56, 128) 24576 ['conv2_block5_0_relu[0][0]']
conv2_block5_1_bn (BatchNormal (None, 56, 56, 128) 512 ['conv2_block5_1_conv[0][0]']
ization)
conv2_block5_1_relu (Activatio (None, 56, 56, 128) 0 ['conv2_block5_1_bn[0][0]']
n)
conv2_block5_2_conv (Conv2D) (None, 56, 56, 32) 36864 ['conv2_block5_1_relu[0][0]']
conv2_block5_concat (Concatena (None, 56, 56, 224) 0 ['conv2_block4_concat[0][0]',
te) 'conv2_block5_2_conv[0][0]']
conv2_block6_0_bn (BatchNormal (None, 56, 56, 224) 896 ['conv2_block5_concat[0][0]']
ization)
conv2_block6_0_relu (Activatio (None, 56, 56, 224) 0 ['conv2_block6_0_bn[0][0]']
n)
conv2_block6_1_conv (Conv2D) (None, 56, 56, 128) 28672 ['conv2_block6_0_relu[0][0]']
conv2_block6_1_bn (BatchNormal (None, 56, 56, 128) 512 ['conv2_block6_1_conv[0][0]']
ization)
conv2_block6_1_relu (Activatio (None, 56, 56, 128) 0 ['conv2_block6_1_bn[0][0]']
n)
conv2_block6_2_conv (Conv2D) (None, 56, 56, 32) 36864 ['conv2_block6_1_relu[0][0]']
conv2_block6_concat (Concatena (None, 56, 56, 256) 0 ['conv2_block5_concat[0][0]',
te) 'conv2_block6_2_conv[0][0]']
pool2_bn (BatchNormalization) (None, 56, 56, 256) 1024 ['conv2_block6_concat[0][0]']
pool2_relu (Activation) (None, 56, 56, 256) 0 ['pool2_bn[0][0]']
pool2_conv (Conv2D) (None, 56, 56, 128) 32768 ['pool2_relu[0][0]']
pool2_pool (AveragePooling2D) (None, 28, 28, 128) 0 ['pool2_conv[0][0]']
conv3_block1_0_bn (BatchNormal (None, 28, 28, 128) 512 ['pool2_pool[0][0]']
ization)
conv3_block1_0_relu (Activatio (None, 28, 28, 128) 0 ['conv3_block1_0_bn[0][0]']
n)
conv3_block1_1_conv (Conv2D) (None, 28, 28, 128) 16384 ['conv3_block1_0_relu[0][0]']
conv3_block1_1_bn (BatchNormal (None, 28, 28, 128) 512 ['conv3_block1_1_conv[0][0]']
ization)
conv3_block1_1_relu (Activatio (None, 28, 28, 128) 0 ['conv3_block1_1_bn[0][0]']
n)
conv3_block1_2_conv (Conv2D) (None, 28, 28, 32) 36864 ['conv3_block1_1_relu[0][0]']
conv3_block1_concat (Concatena (None, 28, 28, 160) 0 ['pool2_pool[0][0]',
te) 'conv3_block1_2_conv[0][0]']
conv3_block2_0_bn (BatchNormal (None, 28, 28, 160) 640 ['conv3_block1_concat[0][0]']
ization)
conv3_block2_0_relu (Activatio (None, 28, 28, 160) 0 ['conv3_block2_0_bn[0][0]']
n)
conv3_block2_1_conv (Conv2D) (None, 28, 28, 128) 20480 ['conv3_block2_0_relu[0][0]']
conv3_block2_1_bn (BatchNormal (None, 28, 28, 128) 512 ['conv3_block2_1_conv[0][0]']
ization)
conv3_block2_1_relu (Activatio (None, 28, 28, 128) 0 ['conv3_block2_1_bn[0][0]']
n)
conv3_block2_2_conv (Conv2D) (None, 28, 28, 32) 36864 ['conv3_block2_1_relu[0][0]']
conv3_block2_concat (Concatena (None, 28, 28, 192) 0 ['conv3_block1_concat[0][0]',
te) 'conv3_block2_2_conv[0][0]']
conv3_block3_0_bn (BatchNormal (None, 28, 28, 192) 768 ['conv3_block2_concat[0][0]']
ization)
conv3_block3_0_relu (Activatio (None, 28, 28, 192) 0 ['conv3_block3_0_bn[0][0]']
n)
conv3_block3_1_conv (Conv2D) (None, 28, 28, 128) 24576 ['conv3_block3_0_relu[0][0]']
conv3_block3_1_bn (BatchNormal (None, 28, 28, 128) 512 ['conv3_block3_1_conv[0][0]']
ization)
conv3_block3_1_relu (Activatio (None, 28, 28, 128) 0 ['conv3_block3_1_bn[0][0]']
n)
conv3_block3_2_conv (Conv2D) (None, 28, 28, 32) 36864 ['conv3_block3_1_relu[0][0]']
conv3_block3_concat (Concatena (None, 28, 28, 224) 0 ['conv3_block2_concat[0][0]',
te) 'conv3_block3_2_conv[0][0]']
conv3_block4_0_bn (BatchNormal (None, 28, 28, 224) 896 ['conv3_block3_concat[0][0]']
ization)
 conv3_block4_0_relu (Activation)        (None, 28, 28, 224)   0        ['conv3_block4_0_bn[0][0]']
 conv3_block4_1_conv (Conv2D)            (None, 28, 28, 128)   28672    ['conv3_block4_0_relu[0][0]']
 conv3_block4_1_bn (BatchNormalization)  (None, 28, 28, 128)   512      ['conv3_block4_1_conv[0][0]']
 conv3_block4_1_relu (Activation)        (None, 28, 28, 128)   0        ['conv3_block4_1_bn[0][0]']
 conv3_block4_2_conv (Conv2D)            (None, 28, 28, 32)    36864    ['conv3_block4_1_relu[0][0]']
 conv3_block4_concat (Concatenate)       (None, 28, 28, 256)   0        ['conv3_block3_concat[0][0]',
                                                                         'conv3_block4_2_conv[0][0]']
 ... (conv3_block5 through conv3_block12 repeat the same bn → relu → 1×1 conv → bn → relu →
      3×3 conv → concat pattern, each block adding 32 channels;
      conv3_block12_concat ends at (None, 28, 28, 512)) ...
 pool3_bn (BatchNormalization)           (None, 28, 28, 512)   2048     ['conv3_block12_concat[0][0]']
 pool3_relu (Activation)                 (None, 28, 28, 512)   0        ['pool3_bn[0][0]']
 pool3_conv (Conv2D)                     (None, 28, 28, 256)   131072   ['pool3_relu[0][0]']
 pool3_pool (AveragePooling2D)           (None, 14, 14, 256)   0        ['pool3_conv[0][0]']
 ... (conv4_block1 through conv4_block24 repeat the same dense-block pattern at 14×14,
      growing from 256 to 1024 channels;
      conv4_block24_concat ends at (None, 14, 14, 1024)) ...
 pool4_bn (BatchNormalization)           (None, 14, 14, 1024)  4096     ['conv4_block24_concat[0][0]']
 pool4_relu (Activation)                 (None, 14, 14, 1024)  0        ['pool4_bn[0][0]']
 pool4_conv (Conv2D)                     (None, 14, 14, 512)   524288   ['pool4_relu[0][0]']
 pool4_pool (AveragePooling2D)           (None, 7, 7, 512)     0        ['pool4_conv[0][0]']
 ... (conv5_block1 through conv5_block16 repeat the same dense-block pattern at 7×7,
      growing from 512 to 1024 channels;
      conv5_block16_concat ends at (None, 7, 7, 1024)) ...
 bn (BatchNormalization)                 (None, 7, 7, 1024)    4096     ['conv5_block16_concat[0][0]']
 relu (Activation)                       (None, 7, 7, 1024)    0        ['bn[0][0]']
 Flatten_for_hidden_layers (GlobalAveragePooling2D)
                                         (None, 1024)          0        ['relu[0][0]']
 Dropout1 (Dropout)                      (None, 1024)          0        ['Flatten_for_hidden_layers[0][0]']
 Hidden_Layer1 (Dense)                   (None, 128)           131200   ['Dropout1[0][0]']
 Dropout2 (Dropout)                      (None, 128)           0        ['Hidden_Layer1[0][0]']
 Hidden_Layer2 (Dense)                   (None, 128)           16512    ['Dropout2[0][0]']
 Dropout3 (Dropout)                      (None, 128)           0        ['Hidden_Layer2[0][0]']
 Hidden_Layer3 (Dense)                   (None, 128)           16512    ['Dropout3[0][0]']
 output (Dense)                          (None, 4)             516      ['Hidden_Layer3[0][0]']
==================================================================================================
Total params: 7,202,244
Trainable params: 7,118,596
Non-trainable params: 83,648
__________________________________________________________________________________________________
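The Dense-layer parameter counts in the custom head above can be checked by hand: a `Dense` layer holds `inputs × units` weights plus one bias per unit. A quick sanity check against the summary:

```python
def dense_params(n_in, n_units):
    # weight matrix (n_in * n_units) plus one bias per unit
    return n_in * n_units + n_units

print(dense_params(1024, 128))  # Hidden_Layer1 -> 131200
print(dense_params(128, 128))   # Hidden_Layer2 / Hidden_Layer3 -> 16512
print(dense_params(128, 4))     # output -> 516
```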
# plotting the model
plot_model(densenet_121, to_file='densenet_121.png', show_shapes=True, show_layer_names=True)
dot: graph is too large for cairo-renderer bitmaps. Scaling by 0.681184 to fit
# Size of TRAIN & VALIDATION labels and BATCH SIZE
y_train.shape[0], BATCH_SIZE, y_val.shape[0]
(11138, 32, 1966)
# Calculating train steps
train_steps = y_train.shape[0] // BATCH_SIZE
train_steps
348
# Calculating validation steps
valid_steps = y_val.shape[0] // BATCH_SIZE
valid_steps
61
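Both counts come from floor division, which drops the final partial batch. A quick check of the figures above:

```python
# Full batches per epoch; floor division drops the final partial batch
def steps_per_epoch(n_samples, batch_size):
    return n_samples // batch_size

print(steps_per_epoch(11138, 32))  # train_steps -> 348
print(steps_per_epoch(1966, 32))   # valid_steps -> 61
```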
# Training the DenseNet-121 model with a custom top
history2 = densenet_121.fit(X_train, y_train,
                            epochs=15,
                            batch_size=BATCH_SIZE,
                            callbacks=[tensorboard_callback2, reduce_lr2],
                            steps_per_epoch=train_steps,
                            validation_steps=valid_steps,
                            validation_data=(X_val, y_val),
                            class_weight=cw1_dict,
                            verbose=1)
Epoch 1/15 - 158s 389ms/step - loss: 1.6506 - categorical_accuracy: 0.3810 - f1_score: 0.3392 - val_loss: 0.7000 - val_categorical_accuracy: 0.8253 - val_f1_score: 0.6663 - lr: 1.0000e-04
Epoch 2/15 - 136s 386ms/step - loss: 0.8223 - categorical_accuracy: 0.7236 - f1_score: 0.6344 - val_loss: 0.2522 - val_categorical_accuracy: 0.9365 - val_f1_score: 0.8723 - lr: 1.0000e-04
Epoch 3/15 - 134s 385ms/step - loss: 0.4886 - categorical_accuracy: 0.8540 - f1_score: 0.7716 - val_loss: 0.2155 - val_categorical_accuracy: 0.9390 - val_f1_score: 0.8833 - lr: 1.0000e-04
Epoch 4/15 - 134s 385ms/step - loss: 0.2863 - categorical_accuracy: 0.9173 - f1_score: 0.8558 - val_loss: 0.1057 - val_categorical_accuracy: 0.9641 - val_f1_score: 0.9249 - lr: 1.0000e-04
Epoch 5/15 - 134s 385ms/step - loss: 0.1647 - categorical_accuracy: 0.9529 - f1_score: 0.9077 - val_loss: 0.1725 - val_categorical_accuracy: 0.9508 - val_f1_score: 0.9287 - lr: 1.0000e-04
Epoch 6: ReduceLROnPlateau reducing learning rate to 9.999999747378752e-06.
Epoch 6/15 - 134s 385ms/step - loss: 0.1580 - categorical_accuracy: 0.9580 - f1_score: 0.9189 - val_loss: 0.1425 - val_categorical_accuracy: 0.9565 - val_f1_score: 0.9179 - lr: 1.0000e-04
Epoch 7/15 - 134s 385ms/step - loss: 0.0894 - categorical_accuracy: 0.9778 - f1_score: 0.9564 - val_loss: 0.0236 - val_categorical_accuracy: 0.9933 - val_f1_score: 0.9839 - lr: 1.0000e-05
Epoch 8/15 - 134s 385ms/step - loss: 0.0634 - categorical_accuracy: 0.9824 - f1_score: 0.9642 - val_loss: 0.0150 - val_categorical_accuracy: 0.9959 - val_f1_score: 0.9902 - lr: 1.0000e-05
Epoch 9/15 - 134s 385ms/step - loss: 0.0608 - categorical_accuracy: 0.9847 - f1_score: 0.9693 - val_loss: 0.0158 - val_categorical_accuracy: 0.9959 - val_f1_score: 0.9893 - lr: 1.0000e-05
Epoch 10/15 - 134s 384ms/step - loss: 0.0425 - categorical_accuracy: 0.9890 - f1_score: 0.9776 - val_loss: 0.0119 - val_categorical_accuracy: 0.9980 - val_f1_score: 0.9940 - lr: 1.0000e-05
Epoch 11/15 - 134s 385ms/step - loss: 0.0355 - categorical_accuracy: 0.9902 - f1_score: 0.9804 - val_loss: 0.0117 - val_categorical_accuracy: 0.9980 - val_f1_score: 0.9940 - lr: 1.0000e-05
Epoch 12/15 - 134s 385ms/step - loss: 0.0364 - categorical_accuracy: 0.9918 - f1_score: 0.9839 - val_loss: 0.0127 - val_categorical_accuracy: 0.9969 - val_f1_score: 0.9921 - lr: 1.0000e-05
Epoch 13: ReduceLROnPlateau reducing learning rate to 9.999999747378752e-07.
Epoch 13/15 - 134s 385ms/step - loss: 0.0356 - categorical_accuracy: 0.9912 - f1_score: 0.9823 - val_loss: 0.0129 - val_categorical_accuracy: 0.9974 - val_f1_score: 0.9925 - lr: 1.0000e-05
Epoch 14/15 - 134s 386ms/step - loss: 0.0332 - categorical_accuracy: 0.9921 - f1_score: 0.9851 - val_loss: 0.0119 - val_categorical_accuracy: 0.9974 - val_f1_score: 0.9924 - lr: 1.0000e-06
Epoch 15/15 - 134s 385ms/step - loss: 0.0333 - categorical_accuracy: 0.9925 - f1_score: 0.9845 - val_loss: 0.0105 - val_categorical_accuracy: 0.9980 - val_f1_score: 0.9940 - lr: 1.0000e-06
# Actual TGT classes distribution in validation set
# print("\n:::: Validation Set ====> ACTUAL TGT Classes Distribution ::::\n")
# display(val_tgt_classes_dist)
# Plotting the Final Results on Validation Set
print("\n:::: Validation Set ====> PREDICTION Confusion Matrix ::::\n")
densenet_121_global_tuning_val_results = confusion_matrix_(y_val, X_val, densenet_121)
# Displaying the overall performance results
print("\n:::: Validation Set ====> FINAL Results ::::\n")
display(densenet_121_global_tuning_val_results)
:::: Validation Set ====> PREDICTION Confusion Matrix ::::
62/62 [==============================] - 8s 109ms/step
:::: Validation Set ====> FINAL Results ::::
|  | Healthy | Multiple_Diseases | Rust | Scab |
|---|---|---|---|---|
| BINARY Accuracy | 1.0 | 0.9990 | 0.9992 | 0.9990 |
| Precision | 1.0 | 0.9970 | 0.9985 | 0.9980 |
| Recall | 1.0 | 0.9970 | 0.9977 | 0.9980 |
| Macro F1 Score | 1.0 | 0.9889 | 0.9994 | 0.9983 |
| Macro ROC AUC Score | 1.0 | 0.9889 | 0.9992 | 0.9985 |
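The `confusion_matrix_` helper used above is defined earlier in the notebook. For reference, a minimal sketch of how per-class one-vs-rest metrics like those in the table can be computed with scikit-learn, assuming one-hot labels and softmax outputs (`per_class_metrics` is a hypothetical name, not the notebook's helper):

```python
import numpy as np
from sklearn.metrics import (accuracy_score, precision_score,
                             recall_score, roc_auc_score)

def per_class_metrics(y_true_onehot, y_prob, class_names):
    """One-vs-rest metrics per class from one-hot labels and softmax scores."""
    y_true = y_true_onehot.argmax(axis=1)
    y_pred = y_prob.argmax(axis=1)
    results = {}
    for i, name in enumerate(class_names):
        t = (y_true == i).astype(int)   # one-vs-rest ground truth
        p = (y_pred == i).astype(int)   # one-vs-rest prediction
        results[name] = {
            "binary_accuracy": accuracy_score(t, p),
            "precision": precision_score(t, p, zero_division=0),
            "recall": recall_score(t, p, zero_division=0),
            "roc_auc": roc_auc_score(t, y_prob[:, i]),
        }
    return results
```

The actual helper also plots the confusion matrix and aggregates macro scores; this sketch only shows the per-class bookkeeping.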
OBSERVATIONS
# Actual TGT classes distribution in TEST set
# print("\n:::: TEST Set ====> ACTUAL TGT Classes Distribution ::::\n")
# display(val_tgt_classes_dist)
# Plotting the Results on TEST Set
print("\n:::: TEST Set ====> PREDICTION Confusion Matrix ::::\n")
densenet_121_global_tuning_test_results = confusion_matrix_(y_test, X_test, densenet_121)
# Displaying the overall performance results
print("\n:::: TEST Set ====> FINAL Results ::::\n")
display(densenet_121_global_tuning_test_results)
:::: TEST Set ====> PREDICTION Confusion Matrix ::::
12/12 [==============================] - 2s 184ms/step
:::: TEST Set ====> FINAL Results ::::
|  | Healthy | Multiple_Diseases | Rust | Scab |
|---|---|---|---|---|
| BINARY Accuracy | 0.9068 | 0.9151 | 0.9352 | 0.9247 |
| Precision | 0.7518 | 0.6810 | 0.8114 | 0.8493 |
| Recall | 1.0000 | 0.9174 | 0.9268 | 0.8493 |
| Macro F1 Score | 0.8945 | 0.6614 | 0.9722 | 0.8669 |
| Macro ROC AUC Score | 0.9351 | 0.6963 | 0.9659 | 0.8405 |
OBSERVATIONS
The model does well on the Rust, Scab and Healthy images. On the Multiple_Diseases class, which has the fewest positive cases, it produces some false positives and false negatives.

curr_run_logdir2.split("/")[-1]
'run_2022_11_04-10_22_19'
notebook.list()
No known TensorBoard instances running.
%tensorboard --logdir logs

OBSERVATIONS
A3.MobileNet---V1¶
# build the MobileNet network
mobile_net_with_no_top_model = tf.keras.applications.mobilenet.MobileNet(include_top=False, weights='imagenet', input_shape=(224,224,3))
# Model summary
mobile_net_with_no_top_model.summary()
Model: "mobilenet_1.00_224"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_1 (InputLayer) [(None, 224, 224, 3)] 0
conv1 (Conv2D) (None, 112, 112, 32) 864
conv1_bn (BatchNormalizatio (None, 112, 112, 32) 128
n)
conv1_relu (ReLU) (None, 112, 112, 32) 0
conv_dw_1 (DepthwiseConv2D) (None, 112, 112, 32) 288
conv_dw_1_bn (BatchNormaliz (None, 112, 112, 32) 128
ation)
conv_dw_1_relu (ReLU) (None, 112, 112, 32) 0
conv_pw_1 (Conv2D) (None, 112, 112, 64) 2048
conv_pw_1_bn (BatchNormaliz (None, 112, 112, 64) 256
ation)
conv_pw_1_relu (ReLU) (None, 112, 112, 64) 0
conv_pad_2 (ZeroPadding2D) (None, 113, 113, 64) 0
conv_dw_2 (DepthwiseConv2D) (None, 56, 56, 64) 576
conv_dw_2_bn (BatchNormaliz (None, 56, 56, 64) 256
ation)
conv_dw_2_relu (ReLU) (None, 56, 56, 64) 0
conv_pw_2 (Conv2D) (None, 56, 56, 128) 8192
conv_pw_2_bn (BatchNormaliz (None, 56, 56, 128) 512
ation)
conv_pw_2_relu (ReLU) (None, 56, 56, 128) 0
conv_dw_3 (DepthwiseConv2D) (None, 56, 56, 128) 1152
conv_dw_3_bn (BatchNormaliz (None, 56, 56, 128) 512
ation)
conv_dw_3_relu (ReLU) (None, 56, 56, 128) 0
conv_pw_3 (Conv2D) (None, 56, 56, 128) 16384
conv_pw_3_bn (BatchNormaliz (None, 56, 56, 128) 512
ation)
conv_pw_3_relu (ReLU) (None, 56, 56, 128) 0
conv_pad_4 (ZeroPadding2D) (None, 57, 57, 128) 0
conv_dw_4 (DepthwiseConv2D) (None, 28, 28, 128) 1152
conv_dw_4_bn (BatchNormaliz (None, 28, 28, 128) 512
ation)
conv_dw_4_relu (ReLU) (None, 28, 28, 128) 0
conv_pw_4 (Conv2D) (None, 28, 28, 256) 32768
conv_pw_4_bn (BatchNormaliz (None, 28, 28, 256) 1024
ation)
conv_pw_4_relu (ReLU) (None, 28, 28, 256) 0
conv_dw_5 (DepthwiseConv2D) (None, 28, 28, 256) 2304
conv_dw_5_bn (BatchNormaliz (None, 28, 28, 256) 1024
ation)
conv_dw_5_relu (ReLU) (None, 28, 28, 256) 0
conv_pw_5 (Conv2D) (None, 28, 28, 256) 65536
conv_pw_5_bn (BatchNormaliz (None, 28, 28, 256) 1024
ation)
conv_pw_5_relu (ReLU) (None, 28, 28, 256) 0
conv_pad_6 (ZeroPadding2D) (None, 29, 29, 256) 0
conv_dw_6 (DepthwiseConv2D) (None, 14, 14, 256) 2304
conv_dw_6_bn (BatchNormaliz (None, 14, 14, 256) 1024
ation)
conv_dw_6_relu (ReLU) (None, 14, 14, 256) 0
conv_pw_6 (Conv2D) (None, 14, 14, 512) 131072
conv_pw_6_bn (BatchNormaliz (None, 14, 14, 512) 2048
ation)
conv_pw_6_relu (ReLU) (None, 14, 14, 512) 0
conv_dw_7 (DepthwiseConv2D) (None, 14, 14, 512) 4608
conv_dw_7_bn (BatchNormaliz (None, 14, 14, 512) 2048
ation)
conv_dw_7_relu (ReLU) (None, 14, 14, 512) 0
conv_pw_7 (Conv2D) (None, 14, 14, 512) 262144
conv_pw_7_bn (BatchNormaliz (None, 14, 14, 512) 2048
ation)
conv_pw_7_relu (ReLU) (None, 14, 14, 512) 0
conv_dw_8 (DepthwiseConv2D) (None, 14, 14, 512) 4608
conv_dw_8_bn (BatchNormaliz (None, 14, 14, 512) 2048
ation)
conv_dw_8_relu (ReLU) (None, 14, 14, 512) 0
conv_pw_8 (Conv2D) (None, 14, 14, 512) 262144
conv_pw_8_bn (BatchNormaliz (None, 14, 14, 512) 2048
ation)
conv_pw_8_relu (ReLU) (None, 14, 14, 512) 0
conv_dw_9 (DepthwiseConv2D) (None, 14, 14, 512) 4608
conv_dw_9_bn (BatchNormaliz (None, 14, 14, 512) 2048
ation)
conv_dw_9_relu (ReLU) (None, 14, 14, 512) 0
conv_pw_9 (Conv2D) (None, 14, 14, 512) 262144
conv_pw_9_bn (BatchNormaliz (None, 14, 14, 512) 2048
ation)
conv_pw_9_relu (ReLU) (None, 14, 14, 512) 0
conv_dw_10 (DepthwiseConv2D (None, 14, 14, 512) 4608
)
conv_dw_10_bn (BatchNormali (None, 14, 14, 512) 2048
zation)
conv_dw_10_relu (ReLU) (None, 14, 14, 512) 0
conv_pw_10 (Conv2D) (None, 14, 14, 512) 262144
conv_pw_10_bn (BatchNormali (None, 14, 14, 512) 2048
zation)
conv_pw_10_relu (ReLU) (None, 14, 14, 512) 0
conv_dw_11 (DepthwiseConv2D (None, 14, 14, 512) 4608
)
conv_dw_11_bn (BatchNormali (None, 14, 14, 512) 2048
zation)
conv_dw_11_relu (ReLU) (None, 14, 14, 512) 0
conv_pw_11 (Conv2D) (None, 14, 14, 512) 262144
conv_pw_11_bn (BatchNormali (None, 14, 14, 512) 2048
zation)
conv_pw_11_relu (ReLU) (None, 14, 14, 512) 0
conv_pad_12 (ZeroPadding2D) (None, 15, 15, 512) 0
conv_dw_12 (DepthwiseConv2D (None, 7, 7, 512) 4608
)
conv_dw_12_bn (BatchNormali (None, 7, 7, 512) 2048
zation)
conv_dw_12_relu (ReLU) (None, 7, 7, 512) 0
conv_pw_12 (Conv2D) (None, 7, 7, 1024) 524288
conv_pw_12_bn (BatchNormali (None, 7, 7, 1024) 4096
zation)
conv_pw_12_relu (ReLU) (None, 7, 7, 1024) 0
conv_dw_13 (DepthwiseConv2D (None, 7, 7, 1024) 9216
)
conv_dw_13_bn (BatchNormali (None, 7, 7, 1024) 4096
zation)
conv_dw_13_relu (ReLU) (None, 7, 7, 1024) 0
conv_pw_13 (Conv2D) (None, 7, 7, 1024) 1048576
conv_pw_13_bn (BatchNormali (None, 7, 7, 1024) 4096
zation)
conv_pw_13_relu (ReLU) (None, 7, 7, 1024) 0
=================================================================
Total params: 3,228,864
Trainable params: 3,206,976
Non-trainable params: 21,888
_________________________________________________________________
# Instantiating Optimizer
learning_rate= 0.0001
opt3 = tf.keras.optimizers.Adam(learning_rate=learning_rate)
# Reduce Learning Rate on Plateau
reduce_lr3 = tf.keras.callbacks.ReduceLROnPlateau(monitor='val_loss', factor=0.1, patience=2, verbose=1, mode='auto', min_delta=0.0001)
# Logs directory
curr_run_logdir3 = get_run_logdir()
# Instantiating Tensorboard callback
tensorboard_callback3 = TensorBoard(log_dir=curr_run_logdir3, histogram_freq=1)
# Setting the seed
os.environ['PYTHONHASHSEED'] = '0'
# Clearing the TF session
tf.keras.backend.clear_session()
# defining the custom top of the MobileNet model
for layer in mobile_net_with_no_top_model.layers:
layer.trainable = True
# Adding additional layers
input_layer = mobile_net_with_no_top_model.output
# Defining the top layers structure of the model
flatten = tf.keras.layers.GlobalAveragePooling2D(name='Flatten_for_hidden_layers')(input_layer)
dropout_1 = Dropout(rate=0.5, name='Dropout1')(flatten)
dense_layer1 = tf.keras.layers.Dense(units=128,
                                     activation='relu',
                                     use_bias=True,
                                     kernel_initializer=tf.keras.initializers.he_normal(seed=80),
                                     bias_initializer=tf.keras.initializers.he_normal(seed=110),
                                     name='Hidden_Layer1')(dropout_1)
dropout_2 = Dropout(rate=0.5, name='Dropout2')(dense_layer1)
dense_layer2 = tf.keras.layers.Dense(units=128,
                                     activation='relu',
                                     use_bias=True,
                                     kernel_initializer=tf.keras.initializers.he_normal(seed=80),
                                     bias_initializer=tf.keras.initializers.he_normal(seed=110),
                                     name='Hidden_Layer2')(dropout_2)
dropout_3 = Dropout(rate=0.5, name='Dropout3')(dense_layer2)
dense_layer3 = tf.keras.layers.Dense(units=128,
                                     activation='relu',
                                     use_bias=True,
                                     kernel_initializer=tf.keras.initializers.he_normal(seed=80),
                                     bias_initializer=tf.keras.initializers.he_normal(seed=110),
                                     name='Hidden_Layer3')(dropout_3)
output_layer = tf.keras.layers.Dense(4, activation='softmax', name="output")(dense_layer3)
# Instantiating the complete model
mobile_net = Model(inputs=mobile_net_with_no_top_model.input, outputs=output_layer)
# Compiling the model
mobile_net.compile(optimizer=opt3,
                   loss='categorical_crossentropy',
                   metrics=['categorical_accuracy', tfa_f1_scr])
# Summary of the MobileNet model with custom top
mobile_net.summary()
Model: "model"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_1 (InputLayer) [(None, 224, 224, 3)] 0
conv1 (Conv2D) (None, 112, 112, 32) 864
conv1_bn (BatchNormalizatio (None, 112, 112, 32) 128
n)
conv1_relu (ReLU) (None, 112, 112, 32) 0
conv_dw_1 (DepthwiseConv2D) (None, 112, 112, 32) 288
conv_dw_1_bn (BatchNormaliz (None, 112, 112, 32) 128
ation)
conv_dw_1_relu (ReLU) (None, 112, 112, 32) 0
conv_pw_1 (Conv2D) (None, 112, 112, 64) 2048
conv_pw_1_bn (BatchNormaliz (None, 112, 112, 64) 256
ation)
conv_pw_1_relu (ReLU) (None, 112, 112, 64) 0
conv_pad_2 (ZeroPadding2D) (None, 113, 113, 64) 0
conv_dw_2 (DepthwiseConv2D) (None, 56, 56, 64) 576
conv_dw_2_bn (BatchNormaliz (None, 56, 56, 64) 256
ation)
conv_dw_2_relu (ReLU) (None, 56, 56, 64) 0
conv_pw_2 (Conv2D) (None, 56, 56, 128) 8192
conv_pw_2_bn (BatchNormaliz (None, 56, 56, 128) 512
ation)
conv_pw_2_relu (ReLU) (None, 56, 56, 128) 0
conv_dw_3 (DepthwiseConv2D) (None, 56, 56, 128) 1152
conv_dw_3_bn (BatchNormaliz (None, 56, 56, 128) 512
ation)
conv_dw_3_relu (ReLU) (None, 56, 56, 128) 0
conv_pw_3 (Conv2D) (None, 56, 56, 128) 16384
conv_pw_3_bn (BatchNormaliz (None, 56, 56, 128) 512
ation)
conv_pw_3_relu (ReLU) (None, 56, 56, 128) 0
conv_pad_4 (ZeroPadding2D) (None, 57, 57, 128) 0
conv_dw_4 (DepthwiseConv2D) (None, 28, 28, 128) 1152
conv_dw_4_bn (BatchNormaliz (None, 28, 28, 128) 512
ation)
conv_dw_4_relu (ReLU) (None, 28, 28, 128) 0
conv_pw_4 (Conv2D) (None, 28, 28, 256) 32768
conv_pw_4_bn (BatchNormaliz (None, 28, 28, 256) 1024
ation)
conv_pw_4_relu (ReLU) (None, 28, 28, 256) 0
conv_dw_5 (DepthwiseConv2D) (None, 28, 28, 256) 2304
conv_dw_5_bn (BatchNormaliz (None, 28, 28, 256) 1024
ation)
conv_dw_5_relu (ReLU) (None, 28, 28, 256) 0
conv_pw_5 (Conv2D) (None, 28, 28, 256) 65536
conv_pw_5_bn (BatchNormaliz (None, 28, 28, 256) 1024
ation)
conv_pw_5_relu (ReLU) (None, 28, 28, 256) 0
conv_pad_6 (ZeroPadding2D) (None, 29, 29, 256) 0
conv_dw_6 (DepthwiseConv2D) (None, 14, 14, 256) 2304
conv_dw_6_bn (BatchNormaliz (None, 14, 14, 256) 1024
ation)
conv_dw_6_relu (ReLU) (None, 14, 14, 256) 0
conv_pw_6 (Conv2D) (None, 14, 14, 512) 131072
conv_pw_6_bn (BatchNormaliz (None, 14, 14, 512) 2048
ation)
conv_pw_6_relu (ReLU) (None, 14, 14, 512) 0
conv_dw_7 (DepthwiseConv2D) (None, 14, 14, 512) 4608
conv_dw_7_bn (BatchNormaliz (None, 14, 14, 512) 2048
ation)
conv_dw_7_relu (ReLU) (None, 14, 14, 512) 0
conv_pw_7 (Conv2D) (None, 14, 14, 512) 262144
conv_pw_7_bn (BatchNormaliz (None, 14, 14, 512) 2048
ation)
conv_pw_7_relu (ReLU) (None, 14, 14, 512) 0
conv_dw_8 (DepthwiseConv2D) (None, 14, 14, 512) 4608
conv_dw_8_bn (BatchNormaliz (None, 14, 14, 512) 2048
ation)
conv_dw_8_relu (ReLU) (None, 14, 14, 512) 0
conv_pw_8 (Conv2D) (None, 14, 14, 512) 262144
conv_pw_8_bn (BatchNormaliz (None, 14, 14, 512) 2048
ation)
conv_pw_8_relu (ReLU) (None, 14, 14, 512) 0
conv_dw_9 (DepthwiseConv2D) (None, 14, 14, 512) 4608
conv_dw_9_bn (BatchNormaliz (None, 14, 14, 512) 2048
ation)
conv_dw_9_relu (ReLU) (None, 14, 14, 512) 0
conv_pw_9 (Conv2D) (None, 14, 14, 512) 262144
conv_pw_9_bn (BatchNormaliz (None, 14, 14, 512) 2048
ation)
conv_pw_9_relu (ReLU) (None, 14, 14, 512) 0
conv_dw_10 (DepthwiseConv2D (None, 14, 14, 512) 4608
)
conv_dw_10_bn (BatchNormali (None, 14, 14, 512) 2048
zation)
conv_dw_10_relu (ReLU) (None, 14, 14, 512) 0
conv_pw_10 (Conv2D) (None, 14, 14, 512) 262144
conv_pw_10_bn (BatchNormali (None, 14, 14, 512) 2048
zation)
conv_pw_10_relu (ReLU) (None, 14, 14, 512) 0
conv_dw_11 (DepthwiseConv2D (None, 14, 14, 512) 4608
)
conv_dw_11_bn (BatchNormali (None, 14, 14, 512) 2048
zation)
conv_dw_11_relu (ReLU) (None, 14, 14, 512) 0
conv_pw_11 (Conv2D) (None, 14, 14, 512) 262144
conv_pw_11_bn (BatchNormali (None, 14, 14, 512) 2048
zation)
conv_pw_11_relu (ReLU) (None, 14, 14, 512) 0
conv_pad_12 (ZeroPadding2D) (None, 15, 15, 512) 0
conv_dw_12 (DepthwiseConv2D (None, 7, 7, 512) 4608
)
conv_dw_12_bn (BatchNormali (None, 7, 7, 512) 2048
zation)
conv_dw_12_relu (ReLU) (None, 7, 7, 512) 0
conv_pw_12 (Conv2D) (None, 7, 7, 1024) 524288
conv_pw_12_bn (BatchNormali (None, 7, 7, 1024) 4096
zation)
conv_pw_12_relu (ReLU) (None, 7, 7, 1024) 0
conv_dw_13 (DepthwiseConv2D (None, 7, 7, 1024) 9216
)
conv_dw_13_bn (BatchNormali (None, 7, 7, 1024) 4096
zation)
conv_dw_13_relu (ReLU) (None, 7, 7, 1024) 0
conv_pw_13 (Conv2D) (None, 7, 7, 1024) 1048576
conv_pw_13_bn (BatchNormali (None, 7, 7, 1024) 4096
zation)
conv_pw_13_relu (ReLU) (None, 7, 7, 1024) 0
Flatten_for_hidden_layers ( (None, 1024) 0
GlobalAveragePooling2D)
Dropout1 (Dropout) (None, 1024) 0
Hidden_Layer1 (Dense) (None, 128) 131200
Dropout2 (Dropout) (None, 128) 0
Hidden_Layer2 (Dense) (None, 128) 16512
Dropout3 (Dropout) (None, 128) 0
Hidden_Layer3 (Dense) (None, 128) 16512
output (Dense) (None, 4) 516
=================================================================
Total params: 3,393,604
Trainable params: 3,371,716
Non-trainable params: 21,888
_________________________________________________________________
# plotting the model
plot_model(mobile_net, to_file='mobile_net.png', show_shapes=True, show_layer_names=True)
tf.autograph.experimental.do_not_convert(func=None)
<function tensorflow.python.autograph.impl.api.do_not_convert(func=None)>
# Size of TRAIN & VALIDATION labels and BATCH SIZE
y_train.shape[0], BATCH_SIZE, y_val.shape[0]
(11138, 32, 1966)
# Calculating train steps
train_steps = y_train.shape[0] // BATCH_SIZE
train_steps
348
# Calculating test steps
valid_steps = y_val.shape[0] // BATCH_SIZE
valid_steps
61
cw1_dict
{0: 0.8862189688096753,
1: 4.954626334519573,
2: 0.7281642259414226,
3: 0.7713296398891967}
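The dictionary above (note the roughly 5x weight on the rare Multiple_Diseases class, index 1) follows the "balanced" scheme of `sklearn.utils.class_weight`, which is imported at the top of the notebook. A small self-contained sketch with hypothetical label counts showing how such a dictionary is produced:

```python
import numpy as np
from sklearn.utils import class_weight

# Hypothetical integer labels mimicking the dataset's imbalance:
# class 1 (Multiple_Diseases) is by far the rarest.
labels = np.array([0] * 400 + [1] * 70 + [2] * 480 + [3] * 450)

# "balanced" assigns weight n_samples / (n_classes * count_per_class).
weights = class_weight.compute_class_weight(
    class_weight="balanced", classes=np.unique(labels), y=labels)
cw_dict = dict(enumerate(weights))  # {class_index: weight}
```

Passing such a dictionary via `class_weight=` in `fit()` scales each sample's loss by its class weight, so the rare class contributes as much to the gradient as the common ones.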
# Training the MobileNet model having custom top
history3 = mobile_net.fit(X_train, y_train,
                          epochs=16,
                          batch_size=BATCH_SIZE,
                          callbacks=[tensorboard_callback3, reduce_lr3],
                          steps_per_epoch=train_steps,
                          validation_steps=valid_steps,
                          validation_data=(X_val, y_val),
                          class_weight=cw1_dict,
                          verbose=1)
Epoch 1/16 - 81s 203ms/step - loss: 1.9490 - categorical_accuracy: 0.3330 - f1_score: 0.2944 - val_loss: 1.0803 - val_categorical_accuracy: 0.5502 - val_f1_score: 0.4507 - lr: 1.0000e-04
Epoch 2/16 - 70s 199ms/step - loss: 1.1952 - categorical_accuracy: 0.5372 - f1_score: 0.4743 - val_loss: 0.6080 - val_categorical_accuracy: 0.8048 - val_f1_score: 0.6937 - lr: 1.0000e-04
Epoch 3/16 - 70s 201ms/step - loss: 0.8493 - categorical_accuracy: 0.7086 - f1_score: 0.6305 - val_loss: 0.3549 - val_categorical_accuracy: 0.8719 - val_f1_score: 0.7952 - lr: 1.0000e-04
Epoch 4/16 - 70s 201ms/step - loss: 0.5761 - categorical_accuracy: 0.8078 - f1_score: 0.7301 - val_loss: 0.2440 - val_categorical_accuracy: 0.9196 - val_f1_score: 0.8700 - lr: 1.0000e-04
Epoch 5/16 - 70s 201ms/step - loss: 0.3983 - categorical_accuracy: 0.8689 - f1_score: 0.8022 - val_loss: 0.1935 - val_categorical_accuracy: 0.9293 - val_f1_score: 0.8760 - lr: 1.0000e-04
Epoch 6/16 - 70s 201ms/step - loss: 0.2801 - categorical_accuracy: 0.9121 - f1_score: 0.8573 - val_loss: 0.1206 - val_categorical_accuracy: 0.9616 - val_f1_score: 0.9307 - lr: 1.0000e-04
Epoch 7/16 - 70s 201ms/step - loss: 0.1838 - categorical_accuracy: 0.9413 - f1_score: 0.9038 - val_loss: 0.1045 - val_categorical_accuracy: 0.9621 - val_f1_score: 0.9258 - lr: 1.0000e-04
Epoch 8/16 - 70s 202ms/step - loss: 0.1308 - categorical_accuracy: 0.9617 - f1_score: 0.9360 - val_loss: 0.0529 - val_categorical_accuracy: 0.9851 - val_f1_score: 0.9698 - lr: 1.0000e-04
Epoch 9/16 - 70s 200ms/step - loss: 0.0940 - categorical_accuracy: 0.9727 - f1_score: 0.9502 - val_loss: 0.0821 - val_categorical_accuracy: 0.9810 - val_f1_score: 0.9565 - lr: 1.0000e-04
Epoch 10: ReduceLROnPlateau reducing learning rate to 9.999999747378752e-06.
Epoch 10/16 - 70s 200ms/step - loss: 0.0868 - categorical_accuracy: 0.9762 - f1_score: 0.9575 - val_loss: 0.0612 - val_categorical_accuracy: 0.9790 - val_f1_score: 0.9613 - lr: 1.0000e-04
Epoch 11/16 - 70s 202ms/step - loss: 0.0597 - categorical_accuracy: 0.9839 - f1_score: 0.9711 - val_loss: 0.0249 - val_categorical_accuracy: 0.9918 - val_f1_score: 0.9827 - lr: 1.0000e-05
Epoch 12/16 - 70s 202ms/step - loss: 0.0489 - categorical_accuracy: 0.9868 - f1_score: 0.9758 - val_loss: 0.0215 - val_categorical_accuracy: 0.9928 - val_f1_score: 0.9855 - lr: 1.0000e-05
Epoch 13/16 - 70s 201ms/step - loss: 0.0415 - categorical_accuracy: 0.9878 - f1_score: 0.9787 - val_loss: 0.0180 - val_categorical_accuracy: 0.9939 - val_f1_score: 0.9863 - lr: 1.0000e-05
Epoch 14/16 - 70s 202ms/step - loss: 0.0398 - categorical_accuracy: 0.9870 - f1_score: 0.9776 - val_loss: 0.0178 - val_categorical_accuracy: 0.9954 - val_f1_score: 0.9897 - lr: 1.0000e-05
Epoch 15/16 - 70s 202ms/step - loss: 0.0295 - categorical_accuracy: 0.9921 - f1_score: 0.9859 - val_loss: 0.0198 - val_categorical_accuracy: 0.9954 - val_f1_score: 0.9907 - lr: 1.0000e-05
Epoch 16: ReduceLROnPlateau reducing learning rate to 9.999999747378752e-07.
Epoch 16/16 - 69s 200ms/step - loss: 0.0328 - categorical_accuracy: 0.9900 - f1_score: 0.9850 - val_loss: 0.0199 - val_categorical_accuracy: 0.9959 - val_f1_score: 0.9911 - lr: 1.0000e-05
# Actual TGT classes distribution in validation set
# print("\n:::: Validation Set ====> ACTUAL TGT Classes Distribution ::::\n")
# display(val_tgt_classes_dist)
# Plotting the Final Results on Validation Set
print("\n:::: Validation Set ====> PREDICTION Confusion Matrix ::::\n")
mobile_net_global_tuning_val_results = confusion_matrix_(y_val, X_val, mobile_net)
# Displaying the overall performance results
print("\n:::: Validation Set ====> FINAL Results ::::\n")
display(mobile_net_global_tuning_val_results)
:::: Validation Set ====> PREDICTION Confusion Matrix ::::
62/62 [==============================] - 3s 40ms/step
:::: Validation Set ====> FINAL Results ::::
|  | Healthy | Multiple_Diseases | Rust | Scab |
|---|---|---|---|---|
| BINARY Accuracy | 0.9980 | 0.9977 | 0.9983 | 0.9980 |
| Precision | 0.9931 | 0.9911 | 0.9955 | 0.9959 |
| Recall | 1.0000 | 0.9955 | 0.9970 | 0.9959 |
| Macro F1 Score | 0.9975 | 0.9861 | 0.9994 | 0.9965 |
| Macro ROC AUC Score | 0.9986 | 0.9837 | 0.9992 | 0.9962 |
OBSERVATIONS
# Actual TGT classes distribution in TEST set
# print("\n:::: TEST Set ====> ACTUAL TGT Classes Distribution ::::\n")
# display(val_tgt_classes_dist)
# Plotting the Results on TEST Set
print("\n:::: TEST Set ====> PREDICTION Confusion Matrix ::::\n")
mobile_net_global_tuning_test_results = confusion_matrix_(y_test, X_test, mobile_net)
# Displaying the overall performance results
print("\n:::: TEST Set ====> FINAL Results ::::\n")
display(mobile_net_global_tuning_test_results)
:::: TEST Set ====> PREDICTION Confusion Matrix ::::
12/12 [==============================] - 1s 61ms/step
:::: TEST Set ====> FINAL Results ::::
|  | Healthy | Multiple_Diseases | Rust | Scab |
|---|---|---|---|---|
| BINARY Accuracy | 0.8438 | 0.8904 | 0.9169 | 0.9055 |
| Precision | 0.6438 | 0.6199 | 0.7719 | 0.8110 |
| Recall | 1.0000 | 0.8760 | 0.8943 | 0.8110 |
| Macro F1 Score | 0.8306 | 0.5870 | 0.9658 | 0.8377 |
| Macro ROC AUC Score | 0.8912 | 0.5718 | 0.9560 | 0.8112 |
OBSERVATIONS
The model does well on the Rust, Scab and Healthy images. On the Multiple_Diseases class, which has the fewest positive cases, it produces some false positives and false negatives, and is not competent enough at identifying the multiple-diseased images.

curr_run_logdir3.split("/")[-1]
'run_2022_11_04-11_05_14'
notebook.list()
No known TensorBoard instances running.
%tensorboard --logdir logs

OBSERVATIONS
A4.MobileNet---V3---Small¶
# build the MobileNet V3 Small network
mobilenet_v3_small_with_no_top_model = tf.keras.applications.MobileNetV3Small(include_top=False, weights='imagenet', input_shape=(224,224,3))
# Model summary
mobilenet_v3_small_with_no_top_model.summary()
Model: "MobilenetV3small"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_1 (InputLayer) [(None, 224, 224, 3 0 []
)]
rescaling (Rescaling) (None, 224, 224, 3) 0 ['input_1[0][0]']
Conv (Conv2D) (None, 112, 112, 16 432 ['rescaling[0][0]']
)
Conv/BatchNorm (BatchNormaliza (None, 112, 112, 16 64 ['Conv[0][0]']
tion) )
tf.__operators__.add (TFOpLamb (None, 112, 112, 16 0 ['Conv/BatchNorm[0][0]']
da) )
re_lu (ReLU) (None, 112, 112, 16 0 ['tf.__operators__.add[0][0]']
)
tf.math.multiply (TFOpLambda) (None, 112, 112, 16 0 ['re_lu[0][0]']
)
multiply (Multiply) (None, 112, 112, 16 0 ['Conv/BatchNorm[0][0]',
) 'tf.math.multiply[0][0]']
expanded_conv/depthwise/pad (Z (None, 113, 113, 16 0 ['multiply[0][0]']
eroPadding2D) )
expanded_conv/depthwise (Depth (None, 56, 56, 16) 144 ['expanded_conv/depthwise/pad[0][
wiseConv2D) 0]']
expanded_conv/depthwise/BatchN (None, 56, 56, 16) 64 ['expanded_conv/depthwise[0][0]']
orm (BatchNormalization)
re_lu_1 (ReLU) (None, 56, 56, 16) 0 ['expanded_conv/depthwise/BatchNo
rm[0][0]']
expanded_conv/squeeze_excite/A (None, 1, 1, 16) 0 ['re_lu_1[0][0]']
vgPool (GlobalAveragePooling2D
)
expanded_conv/squeeze_excite/C (None, 1, 1, 8) 136 ['expanded_conv/squeeze_excite/Av
onv (Conv2D) gPool[0][0]']
expanded_conv/squeeze_excite/R (None, 1, 1, 8) 0 ['expanded_conv/squeeze_excite/Co
elu (ReLU) nv[0][0]']
expanded_conv/squeeze_excite/C (None, 1, 1, 16) 144 ['expanded_conv/squeeze_excite/Re
onv_1 (Conv2D) lu[0][0]']
tf.__operators__.add_1 (TFOpLa (None, 1, 1, 16) 0 ['expanded_conv/squeeze_excite/Co
mbda) nv_1[0][0]']
re_lu_2 (ReLU) (None, 1, 1, 16) 0 ['tf.__operators__.add_1[0][0]']
tf.math.multiply_1 (TFOpLambda (None, 1, 1, 16) 0 ['re_lu_2[0][0]']
)
expanded_conv/squeeze_excite/M (None, 56, 56, 16) 0 ['re_lu_1[0][0]',
ul (Multiply) 'tf.math.multiply_1[0][0]']
expanded_conv/project (Conv2D) (None, 56, 56, 16) 256 ['expanded_conv/squeeze_excite/Mu
l[0][0]']
expanded_conv/project/BatchNor (None, 56, 56, 16) 64 ['expanded_conv/project[0][0]']
m (BatchNormalization)
expanded_conv_1/expand (Conv2D (None, 56, 56, 72) 1152 ['expanded_conv/project/BatchNorm
) [0][0]']
expanded_conv_1/expand/BatchNo (None, 56, 56, 72) 288 ['expanded_conv_1/expand[0][0]']
rm (BatchNormalization)
re_lu_3 (ReLU) (None, 56, 56, 72) 0 ['expanded_conv_1/expand/BatchNor
m[0][0]']
expanded_conv_1/depthwise/pad (None, 57, 57, 72) 0 ['re_lu_3[0][0]']
(ZeroPadding2D)
expanded_conv_1/depthwise (Dep (None, 28, 28, 72) 648 ['expanded_conv_1/depthwise/pad[0
thwiseConv2D) ][0]']
expanded_conv_1/depthwise/Batc (None, 28, 28, 72) 288 ['expanded_conv_1/depthwise[0][0]
hNorm (BatchNormalization) ']
re_lu_4 (ReLU) (None, 28, 28, 72) 0 ['expanded_conv_1/depthwise/Batch
Norm[0][0]']
expanded_conv_1/project (Conv2 (None, 28, 28, 24) 1728 ['re_lu_4[0][0]']
D)
expanded_conv_1/project/BatchN (None, 28, 28, 24) 96 ['expanded_conv_1/project[0][0]']
orm (BatchNormalization)
expanded_conv_2/expand (Conv2D (None, 28, 28, 88) 2112 ['expanded_conv_1/project/BatchNo
) rm[0][0]']
expanded_conv_2/expand/BatchNo (None, 28, 28, 88) 352 ['expanded_conv_2/expand[0][0]']
rm (BatchNormalization)
re_lu_5 (ReLU) (None, 28, 28, 88) 0 ['expanded_conv_2/expand/BatchNor
m[0][0]']
expanded_conv_2/depthwise (Dep (None, 28, 28, 88) 792 ['re_lu_5[0][0]']
thwiseConv2D)
expanded_conv_2/depthwise/Batc (None, 28, 28, 88) 352 ['expanded_conv_2/depthwise[0][0]
hNorm (BatchNormalization) ']
re_lu_6 (ReLU) (None, 28, 28, 88) 0 ['expanded_conv_2/depthwise/Batch
Norm[0][0]']
expanded_conv_2/project (Conv2 (None, 28, 28, 24) 2112 ['re_lu_6[0][0]']
D)
expanded_conv_2/project/BatchN (None, 28, 28, 24) 96 ['expanded_conv_2/project[0][0]']
orm (BatchNormalization)
expanded_conv_2/Add (Add) (None, 28, 28, 24) 0 ['expanded_conv_1/project/BatchNo
rm[0][0]',
'expanded_conv_2/project/BatchNo
rm[0][0]']
expanded_conv_3/expand (Conv2D (None, 28, 28, 96) 2304 ['expanded_conv_2/Add[0][0]']
)
expanded_conv_3/expand/BatchNo (None, 28, 28, 96) 384 ['expanded_conv_3/expand[0][0]']
rm (BatchNormalization)
tf.__operators__.add_2 (TFOpLa (None, 28, 28, 96) 0 ['expanded_conv_3/expand/BatchNor
mbda) m[0][0]']
re_lu_7 (ReLU) (None, 28, 28, 96) 0 ['tf.__operators__.add_2[0][0]']
tf.math.multiply_2 (TFOpLambda (None, 28, 28, 96) 0 ['re_lu_7[0][0]']
)
multiply_1 (Multiply) (None, 28, 28, 96) 0 ['expanded_conv_3/expand/BatchNor
m[0][0]',
'tf.math.multiply_2[0][0]']
expanded_conv_3/depthwise/pad (None, 31, 31, 96) 0 ['multiply_1[0][0]']
(ZeroPadding2D)
expanded_conv_3/depthwise (Dep (None, 14, 14, 96) 2400 ['expanded_conv_3/depthwise/pad[0
thwiseConv2D) ][0]']
expanded_conv_3/depthwise/Batc (None, 14, 14, 96) 384 ['expanded_conv_3/depthwise[0][0]
hNorm (BatchNormalization) ']
tf.__operators__.add_3 (TFOpLa (None, 14, 14, 96) 0 ['expanded_conv_3/depthwise/Batch
mbda) Norm[0][0]']
re_lu_8 (ReLU) (None, 14, 14, 96) 0 ['tf.__operators__.add_3[0][0]']
tf.math.multiply_3 (TFOpLambda (None, 14, 14, 96) 0 ['re_lu_8[0][0]']
)
multiply_2 (Multiply) (None, 14, 14, 96) 0 ['expanded_conv_3/depthwise/Batch
Norm[0][0]',
'tf.math.multiply_3[0][0]']
expanded_conv_3/squeeze_excite (None, 1, 1, 96) 0 ['multiply_2[0][0]']
/AvgPool (GlobalAveragePooling
2D)
expanded_conv_3/squeeze_excite (None, 1, 1, 24) 2328 ['expanded_conv_3/squeeze_excite/
/Conv (Conv2D) AvgPool[0][0]']
expanded_conv_3/squeeze_excite (None, 1, 1, 24) 0 ['expanded_conv_3/squeeze_excite/
/Relu (ReLU) Conv[0][0]']
expanded_conv_3/squeeze_excite (None, 1, 1, 96) 2400 ['expanded_conv_3/squeeze_excite/
/Conv_1 (Conv2D) Relu[0][0]']
tf.__operators__.add_4 (TFOpLa (None, 1, 1, 96) 0 ['expanded_conv_3/squeeze_excite/
mbda) Conv_1[0][0]']
re_lu_9 (ReLU) (None, 1, 1, 96) 0 ['tf.__operators__.add_4[0][0]']
tf.math.multiply_4 (TFOpLambda (None, 1, 1, 96) 0 ['re_lu_9[0][0]']
)
expanded_conv_3/squeeze_excite (None, 14, 14, 96) 0 ['multiply_2[0][0]',
/Mul (Multiply) 'tf.math.multiply_4[0][0]']
expanded_conv_3/project (Conv2 (None, 14, 14, 40) 3840 ['expanded_conv_3/squeeze_excite/
D) Mul[0][0]']
expanded_conv_3/project/BatchN (None, 14, 14, 40) 160 ['expanded_conv_3/project[0][0]']
orm (BatchNormalization)
expanded_conv_4/expand (Conv2D (None, 14, 14, 240) 9600 ['expanded_conv_3/project/BatchNo
) rm[0][0]']
expanded_conv_4/expand/BatchNo (None, 14, 14, 240) 960 ['expanded_conv_4/expand[0][0]']
rm (BatchNormalization)
tf.__operators__.add_5 (TFOpLa (None, 14, 14, 240) 0 ['expanded_conv_4/expand/BatchNor
mbda) m[0][0]']
re_lu_10 (ReLU) (None, 14, 14, 240) 0 ['tf.__operators__.add_5[0][0]']
tf.math.multiply_5 (TFOpLambda (None, 14, 14, 240) 0 ['re_lu_10[0][0]']
)
multiply_3 (Multiply) (None, 14, 14, 240) 0 ['expanded_conv_4/expand/BatchNor
m[0][0]',
'tf.math.multiply_5[0][0]']
expanded_conv_4/depthwise (Dep (None, 14, 14, 240) 6000 ['multiply_3[0][0]']
thwiseConv2D)
expanded_conv_4/depthwise/Batc (None, 14, 14, 240) 960 ['expanded_conv_4/depthwise[0][0]
hNorm (BatchNormalization) ']
tf.__operators__.add_6 (TFOpLa (None, 14, 14, 240) 0 ['expanded_conv_4/depthwise/Batch
mbda) Norm[0][0]']
re_lu_11 (ReLU) (None, 14, 14, 240) 0 ['tf.__operators__.add_6[0][0]']
tf.math.multiply_6 (TFOpLambda (None, 14, 14, 240) 0 ['re_lu_11[0][0]']
)
multiply_4 (Multiply) (None, 14, 14, 240) 0 ['expanded_conv_4/depthwise/Batch
Norm[0][0]',
'tf.math.multiply_6[0][0]']
expanded_conv_4/squeeze_excite (None, 1, 1, 240) 0 ['multiply_4[0][0]']
/AvgPool (GlobalAveragePooling
2D)
expanded_conv_4/squeeze_excite (None, 1, 1, 64) 15424 ['expanded_conv_4/squeeze_excite/
/Conv (Conv2D) AvgPool[0][0]']
expanded_conv_4/squeeze_excite (None, 1, 1, 64) 0 ['expanded_conv_4/squeeze_excite/
/Relu (ReLU) Conv[0][0]']
expanded_conv_4/squeeze_excite (None, 1, 1, 240) 15600 ['expanded_conv_4/squeeze_excite/
/Conv_1 (Conv2D) Relu[0][0]']
tf.__operators__.add_7 (TFOpLa (None, 1, 1, 240) 0 ['expanded_conv_4/squeeze_excite/
mbda) Conv_1[0][0]']
re_lu_12 (ReLU) (None, 1, 1, 240) 0 ['tf.__operators__.add_7[0][0]']
tf.math.multiply_7 (TFOpLambda (None, 1, 1, 240) 0 ['re_lu_12[0][0]']
)
expanded_conv_4/squeeze_excite (None, 14, 14, 240) 0 ['multiply_4[0][0]',
/Mul (Multiply) 'tf.math.multiply_7[0][0]']
expanded_conv_4/project (Conv2 (None, 14, 14, 40) 9600 ['expanded_conv_4/squeeze_excite/
D) Mul[0][0]']
expanded_conv_4/project/BatchN (None, 14, 14, 40) 160 ['expanded_conv_4/project[0][0]']
orm (BatchNormalization)
expanded_conv_4/Add (Add) (None, 14, 14, 40) 0 ['expanded_conv_3/project/BatchNo
rm[0][0]',
'expanded_conv_4/project/BatchNo
rm[0][0]']
expanded_conv_5/expand (Conv2D (None, 14, 14, 240) 9600 ['expanded_conv_4/Add[0][0]']
)
expanded_conv_5/expand/BatchNo (None, 14, 14, 240) 960 ['expanded_conv_5/expand[0][0]']
rm (BatchNormalization)
tf.__operators__.add_8 (TFOpLa (None, 14, 14, 240) 0 ['expanded_conv_5/expand/BatchNor
mbda) m[0][0]']
re_lu_13 (ReLU) (None, 14, 14, 240) 0 ['tf.__operators__.add_8[0][0]']
tf.math.multiply_8 (TFOpLambda (None, 14, 14, 240) 0 ['re_lu_13[0][0]']
)
multiply_5 (Multiply) (None, 14, 14, 240) 0 ['expanded_conv_5/expand/BatchNor
m[0][0]',
'tf.math.multiply_8[0][0]']
expanded_conv_5/depthwise (Dep (None, 14, 14, 240) 6000 ['multiply_5[0][0]']
thwiseConv2D)
expanded_conv_5/depthwise/Batc (None, 14, 14, 240) 960 ['expanded_conv_5/depthwise[0][0]
hNorm (BatchNormalization) ']
tf.__operators__.add_9 (TFOpLa (None, 14, 14, 240) 0 ['expanded_conv_5/depthwise/Batch
mbda) Norm[0][0]']
re_lu_14 (ReLU) (None, 14, 14, 240) 0 ['tf.__operators__.add_9[0][0]']
tf.math.multiply_9 (TFOpLambda (None, 14, 14, 240) 0 ['re_lu_14[0][0]']
)
multiply_6 (Multiply) (None, 14, 14, 240) 0 ['expanded_conv_5/depthwise/Batch
Norm[0][0]',
'tf.math.multiply_9[0][0]']
expanded_conv_5/squeeze_excite (None, 1, 1, 240) 0 ['multiply_6[0][0]']
/AvgPool (GlobalAveragePooling
2D)
expanded_conv_5/squeeze_excite (None, 1, 1, 64) 15424 ['expanded_conv_5/squeeze_excite/
/Conv (Conv2D) AvgPool[0][0]']
expanded_conv_5/squeeze_excite (None, 1, 1, 64) 0 ['expanded_conv_5/squeeze_excite/
/Relu (ReLU) Conv[0][0]']
expanded_conv_5/squeeze_excite (None, 1, 1, 240) 15600 ['expanded_conv_5/squeeze_excite/
/Conv_1 (Conv2D) Relu[0][0]']
tf.__operators__.add_10 (TFOpL (None, 1, 1, 240) 0 ['expanded_conv_5/squeeze_excite/
ambda) Conv_1[0][0]']
re_lu_15 (ReLU) (None, 1, 1, 240) 0 ['tf.__operators__.add_10[0][0]']
tf.math.multiply_10 (TFOpLambd (None, 1, 1, 240) 0 ['re_lu_15[0][0]']
a)
expanded_conv_5/squeeze_excite (None, 14, 14, 240) 0 ['multiply_6[0][0]',
/Mul (Multiply) 'tf.math.multiply_10[0][0]']
expanded_conv_5/project (Conv2 (None, 14, 14, 40) 9600 ['expanded_conv_5/squeeze_excite/
D) Mul[0][0]']
expanded_conv_5/project/BatchN (None, 14, 14, 40) 160 ['expanded_conv_5/project[0][0]']
orm (BatchNormalization)
expanded_conv_5/Add (Add) (None, 14, 14, 40) 0 ['expanded_conv_4/Add[0][0]',
'expanded_conv_5/project/BatchNo
rm[0][0]']
expanded_conv_6/expand (Conv2D (None, 14, 14, 120) 4800 ['expanded_conv_5/Add[0][0]']
)
expanded_conv_6/expand/BatchNo (None, 14, 14, 120) 480 ['expanded_conv_6/expand[0][0]']
rm (BatchNormalization)
tf.__operators__.add_11 (TFOpL (None, 14, 14, 120) 0 ['expanded_conv_6/expand/BatchNor
ambda) m[0][0]']
re_lu_16 (ReLU) (None, 14, 14, 120) 0 ['tf.__operators__.add_11[0][0]']
tf.math.multiply_11 (TFOpLambd (None, 14, 14, 120) 0 ['re_lu_16[0][0]']
a)
multiply_7 (Multiply) (None, 14, 14, 120) 0 ['expanded_conv_6/expand/BatchNor
m[0][0]',
'tf.math.multiply_11[0][0]']
expanded_conv_6/depthwise (Dep (None, 14, 14, 120) 3000 ['multiply_7[0][0]']
thwiseConv2D)
expanded_conv_6/depthwise/Batc (None, 14, 14, 120) 480 ['expanded_conv_6/depthwise[0][0]
hNorm (BatchNormalization) ']
tf.__operators__.add_12 (TFOpL (None, 14, 14, 120) 0 ['expanded_conv_6/depthwise/Batch
ambda) Norm[0][0]']
re_lu_17 (ReLU) (None, 14, 14, 120) 0 ['tf.__operators__.add_12[0][0]']
tf.math.multiply_12 (TFOpLambd (None, 14, 14, 120) 0 ['re_lu_17[0][0]']
a)
multiply_8 (Multiply) (None, 14, 14, 120) 0 ['expanded_conv_6/depthwise/Batch
Norm[0][0]',
'tf.math.multiply_12[0][0]']
expanded_conv_6/squeeze_excite (None, 1, 1, 120) 0 ['multiply_8[0][0]']
/AvgPool (GlobalAveragePooling
2D)
expanded_conv_6/squeeze_excite (None, 1, 1, 32) 3872 ['expanded_conv_6/squeeze_excite/
/Conv (Conv2D) AvgPool[0][0]']
expanded_conv_6/squeeze_excite (None, 1, 1, 32) 0 ['expanded_conv_6/squeeze_excite/
/Relu (ReLU) Conv[0][0]']
expanded_conv_6/squeeze_excite (None, 1, 1, 120) 3960 ['expanded_conv_6/squeeze_excite/
/Conv_1 (Conv2D) Relu[0][0]']
tf.__operators__.add_13 (TFOpL (None, 1, 1, 120) 0 ['expanded_conv_6/squeeze_excite/
ambda) Conv_1[0][0]']
re_lu_18 (ReLU) (None, 1, 1, 120) 0 ['tf.__operators__.add_13[0][0]']
tf.math.multiply_13 (TFOpLambd (None, 1, 1, 120) 0 ['re_lu_18[0][0]']
a)
expanded_conv_6/squeeze_excite (None, 14, 14, 120) 0 ['multiply_8[0][0]',
/Mul (Multiply) 'tf.math.multiply_13[0][0]']
expanded_conv_6/project (Conv2 (None, 14, 14, 48) 5760 ['expanded_conv_6/squeeze_excite/
D) Mul[0][0]']
expanded_conv_6/project/BatchN (None, 14, 14, 48) 192 ['expanded_conv_6/project[0][0]']
orm (BatchNormalization)
expanded_conv_7/expand (Conv2D (None, 14, 14, 144) 6912 ['expanded_conv_6/project/BatchNo
) rm[0][0]']
expanded_conv_7/expand/BatchNo (None, 14, 14, 144) 576 ['expanded_conv_7/expand[0][0]']
rm (BatchNormalization)
tf.__operators__.add_14 (TFOpL (None, 14, 14, 144) 0 ['expanded_conv_7/expand/BatchNor
ambda) m[0][0]']
re_lu_19 (ReLU) (None, 14, 14, 144) 0 ['tf.__operators__.add_14[0][0]']
tf.math.multiply_14 (TFOpLambd (None, 14, 14, 144) 0 ['re_lu_19[0][0]']
a)
multiply_9 (Multiply) (None, 14, 14, 144) 0 ['expanded_conv_7/expand/BatchNor
m[0][0]',
'tf.math.multiply_14[0][0]']
expanded_conv_7/depthwise (Dep (None, 14, 14, 144) 3600 ['multiply_9[0][0]']
thwiseConv2D)
expanded_conv_7/depthwise/Batc (None, 14, 14, 144) 576 ['expanded_conv_7/depthwise[0][0]
hNorm (BatchNormalization) ']
tf.__operators__.add_15 (TFOpL (None, 14, 14, 144) 0 ['expanded_conv_7/depthwise/Batch
ambda) Norm[0][0]']
re_lu_20 (ReLU) (None, 14, 14, 144) 0 ['tf.__operators__.add_15[0][0]']
tf.math.multiply_15 (TFOpLambd (None, 14, 14, 144) 0 ['re_lu_20[0][0]']
a)
multiply_10 (Multiply) (None, 14, 14, 144) 0 ['expanded_conv_7/depthwise/Batch
Norm[0][0]',
'tf.math.multiply_15[0][0]']
expanded_conv_7/squeeze_excite (None, 1, 1, 144) 0 ['multiply_10[0][0]']
/AvgPool (GlobalAveragePooling
2D)
expanded_conv_7/squeeze_excite (None, 1, 1, 40) 5800 ['expanded_conv_7/squeeze_excite/
/Conv (Conv2D) AvgPool[0][0]']
expanded_conv_7/squeeze_excite (None, 1, 1, 40) 0 ['expanded_conv_7/squeeze_excite/
/Relu (ReLU) Conv[0][0]']
expanded_conv_7/squeeze_excite (None, 1, 1, 144) 5904 ['expanded_conv_7/squeeze_excite/
/Conv_1 (Conv2D) Relu[0][0]']
tf.__operators__.add_16 (TFOpL (None, 1, 1, 144) 0 ['expanded_conv_7/squeeze_excite/
ambda) Conv_1[0][0]']
re_lu_21 (ReLU) (None, 1, 1, 144) 0 ['tf.__operators__.add_16[0][0]']
tf.math.multiply_16 (TFOpLambd (None, 1, 1, 144) 0 ['re_lu_21[0][0]']
a)
expanded_conv_7/squeeze_excite (None, 14, 14, 144) 0 ['multiply_10[0][0]',
/Mul (Multiply) 'tf.math.multiply_16[0][0]']
expanded_conv_7/project (Conv2 (None, 14, 14, 48) 6912 ['expanded_conv_7/squeeze_excite/
D) Mul[0][0]']
expanded_conv_7/project/BatchN (None, 14, 14, 48) 192 ['expanded_conv_7/project[0][0]']
orm (BatchNormalization)
expanded_conv_7/Add (Add) (None, 14, 14, 48) 0 ['expanded_conv_6/project/BatchNo
rm[0][0]',
'expanded_conv_7/project/BatchNo
rm[0][0]']
expanded_conv_8/expand (Conv2D (None, 14, 14, 288) 13824 ['expanded_conv_7/Add[0][0]']
)
expanded_conv_8/expand/BatchNo (None, 14, 14, 288) 1152 ['expanded_conv_8/expand[0][0]']
rm (BatchNormalization)
tf.__operators__.add_17 (TFOpL (None, 14, 14, 288) 0 ['expanded_conv_8/expand/BatchNor
ambda) m[0][0]']
re_lu_22 (ReLU) (None, 14, 14, 288) 0 ['tf.__operators__.add_17[0][0]']
tf.math.multiply_17 (TFOpLambd (None, 14, 14, 288) 0 ['re_lu_22[0][0]']
a)
multiply_11 (Multiply) (None, 14, 14, 288) 0 ['expanded_conv_8/expand/BatchNor
m[0][0]',
'tf.math.multiply_17[0][0]']
expanded_conv_8/depthwise/pad (None, 17, 17, 288) 0 ['multiply_11[0][0]']
(ZeroPadding2D)
expanded_conv_8/depthwise (Dep (None, 7, 7, 288) 7200 ['expanded_conv_8/depthwise/pad[0
thwiseConv2D) ][0]']
expanded_conv_8/depthwise/Batc (None, 7, 7, 288) 1152 ['expanded_conv_8/depthwise[0][0]
hNorm (BatchNormalization) ']
tf.__operators__.add_18 (TFOpL (None, 7, 7, 288) 0 ['expanded_conv_8/depthwise/Batch
ambda) Norm[0][0]']
re_lu_23 (ReLU) (None, 7, 7, 288) 0 ['tf.__operators__.add_18[0][0]']
tf.math.multiply_18 (TFOpLambd (None, 7, 7, 288) 0 ['re_lu_23[0][0]']
a)
multiply_12 (Multiply) (None, 7, 7, 288) 0 ['expanded_conv_8/depthwise/Batch
Norm[0][0]',
'tf.math.multiply_18[0][0]']
expanded_conv_8/squeeze_excite (None, 1, 1, 288) 0 ['multiply_12[0][0]']
/AvgPool (GlobalAveragePooling
2D)
expanded_conv_8/squeeze_excite (None, 1, 1, 72) 20808 ['expanded_conv_8/squeeze_excite/
/Conv (Conv2D) AvgPool[0][0]']
expanded_conv_8/squeeze_excite (None, 1, 1, 72) 0 ['expanded_conv_8/squeeze_excite/
/Relu (ReLU) Conv[0][0]']
expanded_conv_8/squeeze_excite (None, 1, 1, 288) 21024 ['expanded_conv_8/squeeze_excite/
/Conv_1 (Conv2D) Relu[0][0]']
tf.__operators__.add_19 (TFOpL (None, 1, 1, 288) 0 ['expanded_conv_8/squeeze_excite/
ambda) Conv_1[0][0]']
re_lu_24 (ReLU) (None, 1, 1, 288) 0 ['tf.__operators__.add_19[0][0]']
tf.math.multiply_19 (TFOpLambd (None, 1, 1, 288) 0 ['re_lu_24[0][0]']
a)
expanded_conv_8/squeeze_excite (None, 7, 7, 288) 0 ['multiply_12[0][0]',
/Mul (Multiply) 'tf.math.multiply_19[0][0]']
expanded_conv_8/project (Conv2 (None, 7, 7, 96) 27648 ['expanded_conv_8/squeeze_excite/
D) Mul[0][0]']
expanded_conv_8/project/BatchN (None, 7, 7, 96) 384 ['expanded_conv_8/project[0][0]']
orm (BatchNormalization)
expanded_conv_9/expand (Conv2D (None, 7, 7, 576) 55296 ['expanded_conv_8/project/BatchNo
) rm[0][0]']
expanded_conv_9/expand/BatchNo (None, 7, 7, 576) 2304 ['expanded_conv_9/expand[0][0]']
rm (BatchNormalization)
tf.__operators__.add_20 (TFOpL (None, 7, 7, 576) 0 ['expanded_conv_9/expand/BatchNor
ambda) m[0][0]']
re_lu_25 (ReLU) (None, 7, 7, 576) 0 ['tf.__operators__.add_20[0][0]']
tf.math.multiply_20 (TFOpLambd (None, 7, 7, 576) 0 ['re_lu_25[0][0]']
a)
multiply_13 (Multiply) (None, 7, 7, 576) 0 ['expanded_conv_9/expand/BatchNor
m[0][0]',
'tf.math.multiply_20[0][0]']
expanded_conv_9/depthwise (Dep (None, 7, 7, 576) 14400 ['multiply_13[0][0]']
thwiseConv2D)
expanded_conv_9/depthwise/Batc (None, 7, 7, 576) 2304 ['expanded_conv_9/depthwise[0][0]
hNorm (BatchNormalization) ']
tf.__operators__.add_21 (TFOpL (None, 7, 7, 576) 0 ['expanded_conv_9/depthwise/Batch
ambda) Norm[0][0]']
re_lu_26 (ReLU) (None, 7, 7, 576) 0 ['tf.__operators__.add_21[0][0]']
tf.math.multiply_21 (TFOpLambd (None, 7, 7, 576) 0 ['re_lu_26[0][0]']
a)
multiply_14 (Multiply) (None, 7, 7, 576) 0 ['expanded_conv_9/depthwise/Batch
Norm[0][0]',
'tf.math.multiply_21[0][0]']
expanded_conv_9/squeeze_excite (None, 1, 1, 576) 0 ['multiply_14[0][0]']
/AvgPool (GlobalAveragePooling
2D)
expanded_conv_9/squeeze_excite (None, 1, 1, 144) 83088 ['expanded_conv_9/squeeze_excite/
/Conv (Conv2D) AvgPool[0][0]']
expanded_conv_9/squeeze_excite (None, 1, 1, 144) 0 ['expanded_conv_9/squeeze_excite/
/Relu (ReLU) Conv[0][0]']
expanded_conv_9/squeeze_excite (None, 1, 1, 576) 83520 ['expanded_conv_9/squeeze_excite/
/Conv_1 (Conv2D) Relu[0][0]']
tf.__operators__.add_22 (TFOpL (None, 1, 1, 576) 0 ['expanded_conv_9/squeeze_excite/
ambda) Conv_1[0][0]']
re_lu_27 (ReLU) (None, 1, 1, 576) 0 ['tf.__operators__.add_22[0][0]']
tf.math.multiply_22 (TFOpLambd (None, 1, 1, 576) 0 ['re_lu_27[0][0]']
a)
expanded_conv_9/squeeze_excite (None, 7, 7, 576) 0 ['multiply_14[0][0]',
/Mul (Multiply) 'tf.math.multiply_22[0][0]']
expanded_conv_9/project (Conv2 (None, 7, 7, 96) 55296 ['expanded_conv_9/squeeze_excite/
D) Mul[0][0]']
expanded_conv_9/project/BatchN (None, 7, 7, 96) 384 ['expanded_conv_9/project[0][0]']
orm (BatchNormalization)
expanded_conv_9/Add (Add) (None, 7, 7, 96) 0 ['expanded_conv_8/project/BatchNo
rm[0][0]',
'expanded_conv_9/project/BatchNo
rm[0][0]']
expanded_conv_10/expand (Conv2 (None, 7, 7, 576) 55296 ['expanded_conv_9/Add[0][0]']
D)
expanded_conv_10/expand/BatchN (None, 7, 7, 576) 2304 ['expanded_conv_10/expand[0][0]']
orm (BatchNormalization)
tf.__operators__.add_23 (TFOpL (None, 7, 7, 576) 0 ['expanded_conv_10/expand/BatchNo
ambda) rm[0][0]']
re_lu_28 (ReLU) (None, 7, 7, 576) 0 ['tf.__operators__.add_23[0][0]']
tf.math.multiply_23 (TFOpLambd (None, 7, 7, 576) 0 ['re_lu_28[0][0]']
a)
multiply_15 (Multiply) (None, 7, 7, 576) 0 ['expanded_conv_10/expand/BatchNo
rm[0][0]',
'tf.math.multiply_23[0][0]']
expanded_conv_10/depthwise (De (None, 7, 7, 576) 14400 ['multiply_15[0][0]']
pthwiseConv2D)
expanded_conv_10/depthwise/Bat (None, 7, 7, 576) 2304 ['expanded_conv_10/depthwise[0][0
chNorm (BatchNormalization) ]']
tf.__operators__.add_24 (TFOpL (None, 7, 7, 576) 0 ['expanded_conv_10/depthwise/Batc
ambda) hNorm[0][0]']
re_lu_29 (ReLU) (None, 7, 7, 576) 0 ['tf.__operators__.add_24[0][0]']
tf.math.multiply_24 (TFOpLambd (None, 7, 7, 576) 0 ['re_lu_29[0][0]']
a)
multiply_16 (Multiply) (None, 7, 7, 576) 0 ['expanded_conv_10/depthwise/Batc
hNorm[0][0]',
'tf.math.multiply_24[0][0]']
expanded_conv_10/squeeze_excit (None, 1, 1, 576) 0 ['multiply_16[0][0]']
e/AvgPool (GlobalAveragePoolin
g2D)
expanded_conv_10/squeeze_excit (None, 1, 1, 144) 83088 ['expanded_conv_10/squeeze_excite
e/Conv (Conv2D) /AvgPool[0][0]']
expanded_conv_10/squeeze_excit (None, 1, 1, 144) 0 ['expanded_conv_10/squeeze_excite
e/Relu (ReLU) /Conv[0][0]']
expanded_conv_10/squeeze_excit (None, 1, 1, 576) 83520 ['expanded_conv_10/squeeze_excite
e/Conv_1 (Conv2D) /Relu[0][0]']
tf.__operators__.add_25 (TFOpL (None, 1, 1, 576) 0 ['expanded_conv_10/squeeze_excite
ambda) /Conv_1[0][0]']
re_lu_30 (ReLU) (None, 1, 1, 576) 0 ['tf.__operators__.add_25[0][0]']
tf.math.multiply_25 (TFOpLambd (None, 1, 1, 576) 0 ['re_lu_30[0][0]']
a)
expanded_conv_10/squeeze_excit (None, 7, 7, 576) 0 ['multiply_16[0][0]',
e/Mul (Multiply) 'tf.math.multiply_25[0][0]']
expanded_conv_10/project (Conv (None, 7, 7, 96) 55296 ['expanded_conv_10/squeeze_excite
2D) /Mul[0][0]']
expanded_conv_10/project/Batch (None, 7, 7, 96) 384 ['expanded_conv_10/project[0][0]'
Norm (BatchNormalization) ]
expanded_conv_10/Add (Add) (None, 7, 7, 96) 0 ['expanded_conv_9/Add[0][0]',
'expanded_conv_10/project/BatchN
orm[0][0]']
Conv_1 (Conv2D) (None, 7, 7, 576) 55296 ['expanded_conv_10/Add[0][0]']
Conv_1/BatchNorm (BatchNormali (None, 7, 7, 576) 2304 ['Conv_1[0][0]']
zation)
tf.__operators__.add_26 (TFOpL (None, 7, 7, 576) 0 ['Conv_1/BatchNorm[0][0]']
ambda)
re_lu_31 (ReLU) (None, 7, 7, 576) 0 ['tf.__operators__.add_26[0][0]']
tf.math.multiply_26 (TFOpLambd (None, 7, 7, 576) 0 ['re_lu_31[0][0]']
a)
multiply_17 (Multiply) (None, 7, 7, 576) 0 ['Conv_1/BatchNorm[0][0]',
'tf.math.multiply_26[0][0]']
==================================================================================================
Total params: 939,120
Trainable params: 927,008
Non-trainable params: 12,112
__________________________________________________________________________________________________
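The repeating `tf.__operators__.add_N → re_lu_N → tf.math.multiply_N` triplets in the summary above are Keras's functional-API expansion of MobileNetV3's hard-sigmoid/hard-swish activations, and each `squeeze_excite/*` group is a squeeze-and-excitation gate (AvgPool, two 1×1 convs, channel-wise Mul). A minimal NumPy sketch of both patterns (function names and the toy shapes are illustrative, not the notebook's code):

```python
import numpy as np

def hard_sigmoid(x):
    # relu6(x + 3) / 6 -- the add -> ReLU -> multiply triplet in the summary
    return np.clip(x + 3.0, 0.0, 6.0) / 6.0

def hard_swish(x):
    # x * hard_sigmoid(x): MobileNetV3's cheap approximation of swish
    return x * hard_sigmoid(x)

def squeeze_excite(feature_map, w_reduce, w_expand):
    # 'Squeeze': global-average pool to one value per channel
    pooled = feature_map.mean(axis=(0, 1))                        # shape (C,)
    # Two 1x1 convs on a 1x1 map are just matmuls; gate in [0, 1]
    gate = hard_sigmoid(np.maximum(pooled @ w_reduce, 0.0) @ w_expand)
    # 'Excite': channel-wise rescale of the input (the Mul layer)
    return feature_map * gate

# Toy check: a 4x4 feature map with 8 channels, bottlenecked to 2
rng = np.random.default_rng(0)
fmap = rng.standard_normal((4, 4, 8))
out = squeeze_excite(fmap, rng.standard_normal((8, 2)), rng.standard_normal((2, 8)))
print(out.shape)  # (4, 4, 8) -- same shape in, same shape out
```

This is why the squeeze-excite branches collapse to `(None, 1, 1, C)` shapes before the `Mul` layer broadcasts them back over the `14×14` or `7×7` spatial grid.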
# Instantiating Optimizer
learning_rate = 0.0001
opt4 = tf.keras.optimizers.Adam(learning_rate=learning_rate)
# Reduce Learning Rate on Plateau
reduce_lr4 = tf.keras.callbacks.ReduceLROnPlateau(monitor='val_loss', factor=0.1, patience=3, verbose=1, mode='auto', min_delta=0.0001)
# Logs directory
curr_run_logdir4 = get_run_logdir()
# Instantiating Tensorboard callback
tensorboard_callback4 = TensorBoard(log_dir=curr_run_logdir4, histogram_freq=1)
# Setting the seed
os.environ['PYTHONHASHSEED'] = '0'
# Clearing the TF session
tf.keras.backend.clear_session()
# Fine-tuning: making every layer of the MobileNetV3-Small backbone trainable
for layer in mobilenet_v3_small_with_no_top_model.layers:
    layer.trainable = True
# Defining the custom top layers on the backbone's output
base_output = mobilenet_v3_small_with_no_top_model.output
flatten = tf.keras.layers.GlobalAveragePooling2D(name='Flatten_for_hidden_layers')(base_output)
dropout_1 = Dropout(rate=0.5, name='Dropout1')(flatten)
dense_layer1 = tf.keras.layers.Dense(units=128,
                                     activation='relu',
                                     use_bias=True,
                                     kernel_initializer=tf.keras.initializers.he_normal(seed=80),
                                     bias_initializer=tf.keras.initializers.he_normal(seed=110),
                                     name='Hidden_Layer1')(dropout_1)
dropout_2 = Dropout(rate=0.5, name='Dropout2')(dense_layer1)
dense_layer2 = tf.keras.layers.Dense(units=128,
                                     activation='relu',
                                     use_bias=True,
                                     kernel_initializer=tf.keras.initializers.he_normal(seed=80),
                                     bias_initializer=tf.keras.initializers.he_normal(seed=110),
                                     name='Hidden_Layer2')(dropout_2)
dropout_3 = Dropout(rate=0.5, name='Dropout3')(dense_layer2)
dense_layer3 = tf.keras.layers.Dense(units=128,
                                     activation='relu',
                                     use_bias=True,
                                     kernel_initializer=tf.keras.initializers.he_normal(seed=80),
                                     bias_initializer=tf.keras.initializers.he_normal(seed=110),
                                     name='Hidden_Layer3')(dropout_3)
output_layer = tf.keras.layers.Dense(4, activation='softmax', name="output")(dense_layer3)
# Instantiating the complete model
mobile_net_v3_sm = Model(inputs=mobilenet_v3_small_with_no_top_model.input, outputs=output_layer)
# Compiling the model
mobile_net_v3_sm.compile(optimizer=opt4,
                         loss='categorical_crossentropy',
                         metrics=['categorical_accuracy', tfa_f1_scr])
# Summary of the MobileNet V3 Small model with custom top
mobile_net_v3_sm.summary()
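The custom head's parameter counts can be verified by hand before reading the summary: the backbone's final `Conv_1` block emits 576 channels, global average pooling keeps that width, and each `Dense` layer contributes `inputs × units + units` (weights plus biases) while `Dropout` adds none. A quick sketch of that arithmetic (helper name is illustrative):

```python
def dense_params(n_in, n_out):
    # Weights (n_in * n_out) plus one bias per output unit
    return n_in * n_out + n_out

head = (dense_params(576, 128)    # Hidden_Layer1 on the 576-dim pooled features
        + dense_params(128, 128)  # Hidden_Layer2
        + dense_params(128, 128)  # Hidden_Layer3
        + dense_params(128, 4))   # 4-way softmax output
print(head)  # 107396
```

So the full model should report the backbone's 939,120 parameters plus 107,396 for the head.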
Model: "model"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_1 (InputLayer) [(None, 224, 224, 3 0 []
)]
rescaling (Rescaling) (None, 224, 224, 3) 0 ['input_1[0][0]']
Conv (Conv2D) (None, 112, 112, 16 432 ['rescaling[0][0]']
)
Conv/BatchNorm (BatchNormaliza (None, 112, 112, 16 64 ['Conv[0][0]']
tion) )
tf.__operators__.add (TFOpLamb (None, 112, 112, 16 0 ['Conv/BatchNorm[0][0]']
da) )
re_lu (ReLU) (None, 112, 112, 16 0 ['tf.__operators__.add[0][0]']
)
tf.math.multiply (TFOpLambda) (None, 112, 112, 16 0 ['re_lu[0][0]']
)
multiply (Multiply) (None, 112, 112, 16 0 ['Conv/BatchNorm[0][0]',
) 'tf.math.multiply[0][0]']
expanded_conv/depthwise/pad (Z (None, 113, 113, 16 0 ['multiply[0][0]']
eroPadding2D) )
expanded_conv/depthwise (Depth (None, 56, 56, 16) 144 ['expanded_conv/depthwise/pad[0][
wiseConv2D) 0]']
expanded_conv/depthwise/BatchN (None, 56, 56, 16) 64 ['expanded_conv/depthwise[0][0]']
orm (BatchNormalization)
re_lu_1 (ReLU) (None, 56, 56, 16) 0 ['expanded_conv/depthwise/BatchNo
rm[0][0]']
expanded_conv/squeeze_excite/A (None, 1, 1, 16) 0 ['re_lu_1[0][0]']
vgPool (GlobalAveragePooling2D
)
expanded_conv/squeeze_excite/C (None, 1, 1, 8) 136 ['expanded_conv/squeeze_excite/Av
onv (Conv2D) gPool[0][0]']
expanded_conv/squeeze_excite/R (None, 1, 1, 8) 0 ['expanded_conv/squeeze_excite/Co
elu (ReLU) nv[0][0]']
expanded_conv/squeeze_excite/C (None, 1, 1, 16) 144 ['expanded_conv/squeeze_excite/Re
onv_1 (Conv2D) lu[0][0]']
tf.__operators__.add_1 (TFOpLa (None, 1, 1, 16) 0 ['expanded_conv/squeeze_excite/Co
mbda) nv_1[0][0]']
re_lu_2 (ReLU) (None, 1, 1, 16) 0 ['tf.__operators__.add_1[0][0]']
tf.math.multiply_1 (TFOpLambda (None, 1, 1, 16) 0 ['re_lu_2[0][0]']
)
expanded_conv/squeeze_excite/M (None, 56, 56, 16) 0 ['re_lu_1[0][0]',
ul (Multiply) 'tf.math.multiply_1[0][0]']
expanded_conv/project (Conv2D) (None, 56, 56, 16) 256 ['expanded_conv/squeeze_excite/Mu
l[0][0]']
expanded_conv/project/BatchNor (None, 56, 56, 16) 64 ['expanded_conv/project[0][0]']
m (BatchNormalization)
expanded_conv_1/expand (Conv2D (None, 56, 56, 72) 1152 ['expanded_conv/project/BatchNorm
) [0][0]']
expanded_conv_1/expand/BatchNo (None, 56, 56, 72) 288 ['expanded_conv_1/expand[0][0]']
rm (BatchNormalization)
re_lu_3 (ReLU) (None, 56, 56, 72) 0 ['expanded_conv_1/expand/BatchNor
m[0][0]']
expanded_conv_1/depthwise/pad (None, 57, 57, 72) 0 ['re_lu_3[0][0]']
(ZeroPadding2D)
expanded_conv_1/depthwise (Dep (None, 28, 28, 72) 648 ['expanded_conv_1/depthwise/pad[0
thwiseConv2D) ][0]']
expanded_conv_1/depthwise/Batc (None, 28, 28, 72) 288 ['expanded_conv_1/depthwise[0][0]
hNorm (BatchNormalization) ']
re_lu_4 (ReLU) (None, 28, 28, 72) 0 ['expanded_conv_1/depthwise/Batch
Norm[0][0]']
expanded_conv_1/project (Conv2 (None, 28, 28, 24) 1728 ['re_lu_4[0][0]']
D)
expanded_conv_1/project/BatchN (None, 28, 28, 24) 96 ['expanded_conv_1/project[0][0]']
orm (BatchNormalization)
expanded_conv_2/expand (Conv2D (None, 28, 28, 88) 2112 ['expanded_conv_1/project/BatchNo
) rm[0][0]']
expanded_conv_2/expand/BatchNo (None, 28, 28, 88) 352 ['expanded_conv_2/expand[0][0]']
rm (BatchNormalization)
re_lu_5 (ReLU) (None, 28, 28, 88) 0 ['expanded_conv_2/expand/BatchNor
m[0][0]']
expanded_conv_2/depthwise (Dep (None, 28, 28, 88) 792 ['re_lu_5[0][0]']
thwiseConv2D)
expanded_conv_2/depthwise/Batc (None, 28, 28, 88) 352 ['expanded_conv_2/depthwise[0][0]
hNorm (BatchNormalization) ']
re_lu_6 (ReLU) (None, 28, 28, 88) 0 ['expanded_conv_2/depthwise/Batch
Norm[0][0]']
expanded_conv_2/project (Conv2 (None, 28, 28, 24) 2112 ['re_lu_6[0][0]']
D)
expanded_conv_2/project/BatchN (None, 28, 28, 24) 96 ['expanded_conv_2/project[0][0]']
orm (BatchNormalization)
expanded_conv_2/Add (Add) (None, 28, 28, 24) 0 ['expanded_conv_1/project/BatchNo
rm[0][0]',
'expanded_conv_2/project/BatchNo
rm[0][0]']
expanded_conv_3/expand (Conv2D (None, 28, 28, 96) 2304 ['expanded_conv_2/Add[0][0]']
)
expanded_conv_3/expand/BatchNo (None, 28, 28, 96) 384 ['expanded_conv_3/expand[0][0]']
rm (BatchNormalization)
tf.__operators__.add_2 (TFOpLa (None, 28, 28, 96) 0 ['expanded_conv_3/expand/BatchNor
mbda) m[0][0]']
re_lu_7 (ReLU) (None, 28, 28, 96) 0 ['tf.__operators__.add_2[0][0]']
tf.math.multiply_2 (TFOpLambda (None, 28, 28, 96) 0 ['re_lu_7[0][0]']
)
multiply_1 (Multiply) (None, 28, 28, 96) 0 ['expanded_conv_3/expand/BatchNor
m[0][0]',
'tf.math.multiply_2[0][0]']
expanded_conv_3/depthwise/pad (None, 31, 31, 96) 0 ['multiply_1[0][0]']
(ZeroPadding2D)
expanded_conv_3/depthwise (Dep (None, 14, 14, 96) 2400 ['expanded_conv_3/depthwise/pad[0
thwiseConv2D) ][0]']
expanded_conv_3/depthwise/Batc (None, 14, 14, 96) 384 ['expanded_conv_3/depthwise[0][0]
hNorm (BatchNormalization) ']
tf.__operators__.add_3 (TFOpLa (None, 14, 14, 96) 0 ['expanded_conv_3/depthwise/Batch
mbda) Norm[0][0]']
re_lu_8 (ReLU) (None, 14, 14, 96) 0 ['tf.__operators__.add_3[0][0]']
tf.math.multiply_3 (TFOpLambda (None, 14, 14, 96) 0 ['re_lu_8[0][0]']
)
multiply_2 (Multiply) (None, 14, 14, 96) 0 ['expanded_conv_3/depthwise/Batch
Norm[0][0]',
'tf.math.multiply_3[0][0]']
expanded_conv_3/squeeze_excite (None, 1, 1, 96) 0 ['multiply_2[0][0]']
/AvgPool (GlobalAveragePooling
2D)
expanded_conv_3/squeeze_excite (None, 1, 1, 24) 2328 ['expanded_conv_3/squeeze_excite/
/Conv (Conv2D) AvgPool[0][0]']
expanded_conv_3/squeeze_excite (None, 1, 1, 24) 0 ['expanded_conv_3/squeeze_excite/
/Relu (ReLU) Conv[0][0]']
expanded_conv_3/squeeze_excite (None, 1, 1, 96) 2400 ['expanded_conv_3/squeeze_excite/
/Conv_1 (Conv2D) Relu[0][0]']
tf.__operators__.add_4 (TFOpLa (None, 1, 1, 96) 0 ['expanded_conv_3/squeeze_excite/
mbda) Conv_1[0][0]']
re_lu_9 (ReLU) (None, 1, 1, 96) 0 ['tf.__operators__.add_4[0][0]']
tf.math.multiply_4 (TFOpLambda (None, 1, 1, 96) 0 ['re_lu_9[0][0]']
)
expanded_conv_3/squeeze_excite (None, 14, 14, 96) 0 ['multiply_2[0][0]',
/Mul (Multiply) 'tf.math.multiply_4[0][0]']
expanded_conv_3/project (Conv2 (None, 14, 14, 40) 3840 ['expanded_conv_3/squeeze_excite/
D) Mul[0][0]']
expanded_conv_3/project/BatchN (None, 14, 14, 40) 160 ['expanded_conv_3/project[0][0]']
orm (BatchNormalization)
expanded_conv_4/expand (Conv2D (None, 14, 14, 240) 9600 ['expanded_conv_3/project/BatchNo
) rm[0][0]']
expanded_conv_4/expand/BatchNo (None, 14, 14, 240) 960 ['expanded_conv_4/expand[0][0]']
rm (BatchNormalization)
tf.__operators__.add_5 (TFOpLa (None, 14, 14, 240) 0 ['expanded_conv_4/expand/BatchNor
mbda) m[0][0]']
re_lu_10 (ReLU) (None, 14, 14, 240) 0 ['tf.__operators__.add_5[0][0]']
tf.math.multiply_5 (TFOpLambda (None, 14, 14, 240) 0 ['re_lu_10[0][0]']
)
multiply_3 (Multiply) (None, 14, 14, 240) 0 ['expanded_conv_4/expand/BatchNor
m[0][0]',
'tf.math.multiply_5[0][0]']
expanded_conv_4/depthwise (Dep (None, 14, 14, 240) 6000 ['multiply_3[0][0]']
thwiseConv2D)
expanded_conv_4/depthwise/Batc (None, 14, 14, 240) 960 ['expanded_conv_4/depthwise[0][0]
hNorm (BatchNormalization) ']
tf.__operators__.add_6 (TFOpLa (None, 14, 14, 240) 0 ['expanded_conv_4/depthwise/Batch
mbda) Norm[0][0]']
re_lu_11 (ReLU) (None, 14, 14, 240) 0 ['tf.__operators__.add_6[0][0]']
tf.math.multiply_6 (TFOpLambda (None, 14, 14, 240) 0 ['re_lu_11[0][0]']
)
multiply_4 (Multiply) (None, 14, 14, 240) 0 ['expanded_conv_4/depthwise/Batch
Norm[0][0]',
'tf.math.multiply_6[0][0]']
expanded_conv_4/squeeze_excite (None, 1, 1, 240) 0 ['multiply_4[0][0]']
/AvgPool (GlobalAveragePooling
2D)
expanded_conv_4/squeeze_excite (None, 1, 1, 64) 15424 ['expanded_conv_4/squeeze_excite/
/Conv (Conv2D) AvgPool[0][0]']
expanded_conv_4/squeeze_excite (None, 1, 1, 64) 0 ['expanded_conv_4/squeeze_excite/
/Relu (ReLU) Conv[0][0]']
expanded_conv_4/squeeze_excite (None, 1, 1, 240) 15600 ['expanded_conv_4/squeeze_excite/
/Conv_1 (Conv2D) Relu[0][0]']
tf.__operators__.add_7 (TFOpLa (None, 1, 1, 240) 0 ['expanded_conv_4/squeeze_excite/
mbda) Conv_1[0][0]']
re_lu_12 (ReLU) (None, 1, 1, 240) 0 ['tf.__operators__.add_7[0][0]']
tf.math.multiply_7 (TFOpLambda (None, 1, 1, 240) 0 ['re_lu_12[0][0]']
)
expanded_conv_4/squeeze_excite (None, 14, 14, 240) 0 ['multiply_4[0][0]',
/Mul (Multiply) 'tf.math.multiply_7[0][0]']
expanded_conv_4/project (Conv2 (None, 14, 14, 40) 9600 ['expanded_conv_4/squeeze_excite/
D) Mul[0][0]']
expanded_conv_4/project/BatchN (None, 14, 14, 40) 160 ['expanded_conv_4/project[0][0]']
orm (BatchNormalization)
expanded_conv_4/Add (Add) (None, 14, 14, 40) 0 ['expanded_conv_3/project/BatchNo
rm[0][0]',
'expanded_conv_4/project/BatchNo
rm[0][0]']
expanded_conv_5/expand (Conv2D (None, 14, 14, 240) 9600 ['expanded_conv_4/Add[0][0]']
)
expanded_conv_5/expand/BatchNo (None, 14, 14, 240) 960 ['expanded_conv_5/expand[0][0]']
rm (BatchNormalization)
tf.__operators__.add_8 (TFOpLa (None, 14, 14, 240) 0 ['expanded_conv_5/expand/BatchNor
mbda) m[0][0]']
re_lu_13 (ReLU) (None, 14, 14, 240) 0 ['tf.__operators__.add_8[0][0]']
tf.math.multiply_8 (TFOpLambda (None, 14, 14, 240) 0 ['re_lu_13[0][0]']
)
multiply_5 (Multiply) (None, 14, 14, 240) 0 ['expanded_conv_5/expand/BatchNor
m[0][0]',
'tf.math.multiply_8[0][0]']
expanded_conv_5/depthwise (Dep (None, 14, 14, 240) 6000 ['multiply_5[0][0]']
thwiseConv2D)
expanded_conv_5/depthwise/Batc (None, 14, 14, 240) 960 ['expanded_conv_5/depthwise[0][0]
hNorm (BatchNormalization) ']
tf.__operators__.add_9 (TFOpLa (None, 14, 14, 240) 0 ['expanded_conv_5/depthwise/Batch
mbda) Norm[0][0]']
re_lu_14 (ReLU) (None, 14, 14, 240) 0 ['tf.__operators__.add_9[0][0]']
tf.math.multiply_9 (TFOpLambda (None, 14, 14, 240) 0 ['re_lu_14[0][0]']
)
multiply_6 (Multiply) (None, 14, 14, 240) 0 ['expanded_conv_5/depthwise/Batch
Norm[0][0]',
'tf.math.multiply_9[0][0]']
expanded_conv_5/squeeze_excite (None, 1, 1, 240) 0 ['multiply_6[0][0]']
/AvgPool (GlobalAveragePooling
2D)
expanded_conv_5/squeeze_excite (None, 1, 1, 64) 15424 ['expanded_conv_5/squeeze_excite/
/Conv (Conv2D) AvgPool[0][0]']
expanded_conv_5/squeeze_excite (None, 1, 1, 64) 0 ['expanded_conv_5/squeeze_excite/
/Relu (ReLU) Conv[0][0]']
expanded_conv_5/squeeze_excite (None, 1, 1, 240) 15600 ['expanded_conv_5/squeeze_excite/
/Conv_1 (Conv2D) Relu[0][0]']
tf.__operators__.add_10 (TFOpL (None, 1, 1, 240) 0 ['expanded_conv_5/squeeze_excite/
ambda) Conv_1[0][0]']
re_lu_15 (ReLU) (None, 1, 1, 240) 0 ['tf.__operators__.add_10[0][0]']
tf.math.multiply_10 (TFOpLambd (None, 1, 1, 240) 0 ['re_lu_15[0][0]']
a)
expanded_conv_5/squeeze_excite (None, 14, 14, 240) 0 ['multiply_6[0][0]',
/Mul (Multiply) 'tf.math.multiply_10[0][0]']
expanded_conv_5/project (Conv2 (None, 14, 14, 40) 9600 ['expanded_conv_5/squeeze_excite/
D) Mul[0][0]']
expanded_conv_5/project/BatchN (None, 14, 14, 40) 160 ['expanded_conv_5/project[0][0]']
orm (BatchNormalization)
expanded_conv_5/Add (Add) (None, 14, 14, 40) 0 ['expanded_conv_4/Add[0][0]',
'expanded_conv_5/project/BatchNo
rm[0][0]']
expanded_conv_6/expand (Conv2D (None, 14, 14, 120) 4800 ['expanded_conv_5/Add[0][0]']
)
expanded_conv_6/expand/BatchNo (None, 14, 14, 120) 480 ['expanded_conv_6/expand[0][0]']
rm (BatchNormalization)
tf.__operators__.add_11 (TFOpL (None, 14, 14, 120) 0 ['expanded_conv_6/expand/BatchNor
ambda) m[0][0]']
re_lu_16 (ReLU) (None, 14, 14, 120) 0 ['tf.__operators__.add_11[0][0]']
tf.math.multiply_11 (TFOpLambd (None, 14, 14, 120) 0 ['re_lu_16[0][0]']
a)
multiply_7 (Multiply) (None, 14, 14, 120) 0 ['expanded_conv_6/expand/BatchNor
m[0][0]',
'tf.math.multiply_11[0][0]']
expanded_conv_6/depthwise (Dep (None, 14, 14, 120) 3000 ['multiply_7[0][0]']
thwiseConv2D)
expanded_conv_6/depthwise/Batc (None, 14, 14, 120) 480 ['expanded_conv_6/depthwise[0][0]
hNorm (BatchNormalization) ']
tf.__operators__.add_12 (TFOpL (None, 14, 14, 120) 0 ['expanded_conv_6/depthwise/Batch
ambda) Norm[0][0]']
re_lu_17 (ReLU) (None, 14, 14, 120) 0 ['tf.__operators__.add_12[0][0]']
tf.math.multiply_12 (TFOpLambd (None, 14, 14, 120) 0 ['re_lu_17[0][0]']
a)
multiply_8 (Multiply) (None, 14, 14, 120) 0 ['expanded_conv_6/depthwise/Batch
Norm[0][0]',
'tf.math.multiply_12[0][0]']
expanded_conv_6/squeeze_excite (None, 1, 1, 120) 0 ['multiply_8[0][0]']
/AvgPool (GlobalAveragePooling
2D)
expanded_conv_6/squeeze_excite (None, 1, 1, 32) 3872 ['expanded_conv_6/squeeze_excite/
/Conv (Conv2D) AvgPool[0][0]']
expanded_conv_6/squeeze_excite (None, 1, 1, 32) 0 ['expanded_conv_6/squeeze_excite/
/Relu (ReLU) Conv[0][0]']
expanded_conv_6/squeeze_excite (None, 1, 1, 120) 3960 ['expanded_conv_6/squeeze_excite/
/Conv_1 (Conv2D) Relu[0][0]']
tf.__operators__.add_13 (TFOpL (None, 1, 1, 120) 0 ['expanded_conv_6/squeeze_excite/
ambda) Conv_1[0][0]']
re_lu_18 (ReLU) (None, 1, 1, 120) 0 ['tf.__operators__.add_13[0][0]']
tf.math.multiply_13 (TFOpLambd (None, 1, 1, 120) 0 ['re_lu_18[0][0]']
a)
expanded_conv_6/squeeze_excite (None, 14, 14, 120) 0 ['multiply_8[0][0]',
/Mul (Multiply) 'tf.math.multiply_13[0][0]']
expanded_conv_6/project (Conv2 (None, 14, 14, 48) 5760 ['expanded_conv_6/squeeze_excite/
D) Mul[0][0]']
expanded_conv_6/project/BatchN (None, 14, 14, 48) 192 ['expanded_conv_6/project[0][0]']
orm (BatchNormalization)
expanded_conv_7/expand (Conv2D (None, 14, 14, 144) 6912 ['expanded_conv_6/project/BatchNo
) rm[0][0]']
expanded_conv_7/expand/BatchNo (None, 14, 14, 144) 576 ['expanded_conv_7/expand[0][0]']
rm (BatchNormalization)
tf.__operators__.add_14 (TFOpL (None, 14, 14, 144) 0 ['expanded_conv_7/expand/BatchNor
ambda) m[0][0]']
re_lu_19 (ReLU) (None, 14, 14, 144) 0 ['tf.__operators__.add_14[0][0]']
tf.math.multiply_14 (TFOpLambd (None, 14, 14, 144) 0 ['re_lu_19[0][0]']
a)
multiply_9 (Multiply) (None, 14, 14, 144) 0 ['expanded_conv_7/expand/BatchNor
m[0][0]',
'tf.math.multiply_14[0][0]']
expanded_conv_7/depthwise (Dep (None, 14, 14, 144) 3600 ['multiply_9[0][0]']
thwiseConv2D)
expanded_conv_7/depthwise/Batc (None, 14, 14, 144) 576 ['expanded_conv_7/depthwise[0][0]
hNorm (BatchNormalization) ']
tf.__operators__.add_15 (TFOpL (None, 14, 14, 144) 0 ['expanded_conv_7/depthwise/Batch
ambda) Norm[0][0]']
re_lu_20 (ReLU) (None, 14, 14, 144) 0 ['tf.__operators__.add_15[0][0]']
tf.math.multiply_15 (TFOpLambd (None, 14, 14, 144) 0 ['re_lu_20[0][0]']
a)
multiply_10 (Multiply) (None, 14, 14, 144) 0 ['expanded_conv_7/depthwise/Batch
Norm[0][0]',
'tf.math.multiply_15[0][0]']
expanded_conv_7/squeeze_excite (None, 1, 1, 144) 0 ['multiply_10[0][0]']
/AvgPool (GlobalAveragePooling
2D)
expanded_conv_7/squeeze_excite (None, 1, 1, 40) 5800 ['expanded_conv_7/squeeze_excite/
/Conv (Conv2D) AvgPool[0][0]']
expanded_conv_7/squeeze_excite (None, 1, 1, 40) 0 ['expanded_conv_7/squeeze_excite/
/Relu (ReLU) Conv[0][0]']
expanded_conv_7/squeeze_excite (None, 1, 1, 144) 5904 ['expanded_conv_7/squeeze_excite/
/Conv_1 (Conv2D) Relu[0][0]']
tf.__operators__.add_16 (TFOpL (None, 1, 1, 144) 0 ['expanded_conv_7/squeeze_excite/
ambda) Conv_1[0][0]']
re_lu_21 (ReLU) (None, 1, 1, 144) 0 ['tf.__operators__.add_16[0][0]']
tf.math.multiply_16 (TFOpLambd (None, 1, 1, 144) 0 ['re_lu_21[0][0]']
a)
expanded_conv_7/squeeze_excite (None, 14, 14, 144) 0 ['multiply_10[0][0]',
/Mul (Multiply) 'tf.math.multiply_16[0][0]']
expanded_conv_7/project (Conv2 (None, 14, 14, 48) 6912 ['expanded_conv_7/squeeze_excite/
D) Mul[0][0]']
expanded_conv_7/project/BatchN (None, 14, 14, 48) 192 ['expanded_conv_7/project[0][0]']
orm (BatchNormalization)
expanded_conv_7/Add (Add) (None, 14, 14, 48) 0 ['expanded_conv_6/project/BatchNo
rm[0][0]',
'expanded_conv_7/project/BatchNo
rm[0][0]']
expanded_conv_8/expand (Conv2D (None, 14, 14, 288) 13824 ['expanded_conv_7/Add[0][0]']
)
expanded_conv_8/expand/BatchNo (None, 14, 14, 288) 1152 ['expanded_conv_8/expand[0][0]']
rm (BatchNormalization)
tf.__operators__.add_17 (TFOpL (None, 14, 14, 288) 0 ['expanded_conv_8/expand/BatchNor
ambda) m[0][0]']
re_lu_22 (ReLU) (None, 14, 14, 288) 0 ['tf.__operators__.add_17[0][0]']
tf.math.multiply_17 (TFOpLambd (None, 14, 14, 288) 0 ['re_lu_22[0][0]']
a)
multiply_11 (Multiply) (None, 14, 14, 288) 0 ['expanded_conv_8/expand/BatchNor
m[0][0]',
'tf.math.multiply_17[0][0]']
expanded_conv_8/depthwise/pad (None, 17, 17, 288) 0 ['multiply_11[0][0]']
(ZeroPadding2D)
expanded_conv_8/depthwise (Dep (None, 7, 7, 288) 7200 ['expanded_conv_8/depthwise/pad[0
thwiseConv2D) ][0]']
expanded_conv_8/depthwise/Batc (None, 7, 7, 288) 1152 ['expanded_conv_8/depthwise[0][0]
hNorm (BatchNormalization) ']
tf.__operators__.add_18 (TFOpL (None, 7, 7, 288) 0 ['expanded_conv_8/depthwise/Batch
ambda) Norm[0][0]']
re_lu_23 (ReLU) (None, 7, 7, 288) 0 ['tf.__operators__.add_18[0][0]']
tf.math.multiply_18 (TFOpLambd (None, 7, 7, 288) 0 ['re_lu_23[0][0]']
a)
multiply_12 (Multiply) (None, 7, 7, 288) 0 ['expanded_conv_8/depthwise/Batch
Norm[0][0]',
'tf.math.multiply_18[0][0]']
expanded_conv_8/squeeze_excite (None, 1, 1, 288) 0 ['multiply_12[0][0]']
/AvgPool (GlobalAveragePooling
2D)
expanded_conv_8/squeeze_excite (None, 1, 1, 72) 20808 ['expanded_conv_8/squeeze_excite/
/Conv (Conv2D) AvgPool[0][0]']
expanded_conv_8/squeeze_excite (None, 1, 1, 72) 0 ['expanded_conv_8/squeeze_excite/
/Relu (ReLU) Conv[0][0]']
expanded_conv_8/squeeze_excite (None, 1, 1, 288) 21024 ['expanded_conv_8/squeeze_excite/
/Conv_1 (Conv2D) Relu[0][0]']
tf.__operators__.add_19 (TFOpL (None, 1, 1, 288) 0 ['expanded_conv_8/squeeze_excite/
ambda) Conv_1[0][0]']
re_lu_24 (ReLU) (None, 1, 1, 288) 0 ['tf.__operators__.add_19[0][0]']
tf.math.multiply_19 (TFOpLambd (None, 1, 1, 288) 0 ['re_lu_24[0][0]']
a)
expanded_conv_8/squeeze_excite (None, 7, 7, 288) 0 ['multiply_12[0][0]',
/Mul (Multiply) 'tf.math.multiply_19[0][0]']
expanded_conv_8/project (Conv2 (None, 7, 7, 96) 27648 ['expanded_conv_8/squeeze_excite/
D) Mul[0][0]']
expanded_conv_8/project/BatchN (None, 7, 7, 96) 384 ['expanded_conv_8/project[0][0]']
orm (BatchNormalization)
expanded_conv_9/expand (Conv2D (None, 7, 7, 576) 55296 ['expanded_conv_8/project/BatchNo
) rm[0][0]']
expanded_conv_9/expand/BatchNo (None, 7, 7, 576) 2304 ['expanded_conv_9/expand[0][0]']
rm (BatchNormalization)
tf.__operators__.add_20 (TFOpL (None, 7, 7, 576) 0 ['expanded_conv_9/expand/BatchNor
ambda) m[0][0]']
re_lu_25 (ReLU) (None, 7, 7, 576) 0 ['tf.__operators__.add_20[0][0]']
tf.math.multiply_20 (TFOpLambd (None, 7, 7, 576) 0 ['re_lu_25[0][0]']
a)
multiply_13 (Multiply) (None, 7, 7, 576) 0 ['expanded_conv_9/expand/BatchNor
m[0][0]',
'tf.math.multiply_20[0][0]']
expanded_conv_9/depthwise (Dep (None, 7, 7, 576) 14400 ['multiply_13[0][0]']
thwiseConv2D)
expanded_conv_9/depthwise/Batc (None, 7, 7, 576) 2304 ['expanded_conv_9/depthwise[0][0]
hNorm (BatchNormalization) ']
tf.__operators__.add_21 (TFOpL (None, 7, 7, 576) 0 ['expanded_conv_9/depthwise/Batch
ambda) Norm[0][0]']
re_lu_26 (ReLU) (None, 7, 7, 576) 0 ['tf.__operators__.add_21[0][0]']
tf.math.multiply_21 (TFOpLambd (None, 7, 7, 576) 0 ['re_lu_26[0][0]']
a)
multiply_14 (Multiply) (None, 7, 7, 576) 0 ['expanded_conv_9/depthwise/Batch
Norm[0][0]',
'tf.math.multiply_21[0][0]']
expanded_conv_9/squeeze_excite (None, 1, 1, 576) 0 ['multiply_14[0][0]']
/AvgPool (GlobalAveragePooling
2D)
expanded_conv_9/squeeze_excite (None, 1, 1, 144) 83088 ['expanded_conv_9/squeeze_excite/
/Conv (Conv2D) AvgPool[0][0]']
expanded_conv_9/squeeze_excite (None, 1, 1, 144) 0 ['expanded_conv_9/squeeze_excite/
/Relu (ReLU) Conv[0][0]']
expanded_conv_9/squeeze_excite (None, 1, 1, 576) 83520 ['expanded_conv_9/squeeze_excite/
/Conv_1 (Conv2D) Relu[0][0]']
tf.__operators__.add_22 (TFOpL (None, 1, 1, 576) 0 ['expanded_conv_9/squeeze_excite/
ambda) Conv_1[0][0]']
re_lu_27 (ReLU) (None, 1, 1, 576) 0 ['tf.__operators__.add_22[0][0]']
tf.math.multiply_22 (TFOpLambd (None, 1, 1, 576) 0 ['re_lu_27[0][0]']
a)
expanded_conv_9/squeeze_excite (None, 7, 7, 576) 0 ['multiply_14[0][0]',
/Mul (Multiply) 'tf.math.multiply_22[0][0]']
expanded_conv_9/project (Conv2 (None, 7, 7, 96) 55296 ['expanded_conv_9/squeeze_excite/
D) Mul[0][0]']
expanded_conv_9/project/BatchN (None, 7, 7, 96) 384 ['expanded_conv_9/project[0][0]']
orm (BatchNormalization)
expanded_conv_9/Add (Add) (None, 7, 7, 96) 0 ['expanded_conv_8/project/BatchNo
rm[0][0]',
'expanded_conv_9/project/BatchNo
rm[0][0]']
expanded_conv_10/expand (Conv2 (None, 7, 7, 576) 55296 ['expanded_conv_9/Add[0][0]']
D)
expanded_conv_10/expand/BatchN (None, 7, 7, 576) 2304 ['expanded_conv_10/expand[0][0]']
orm (BatchNormalization)
tf.__operators__.add_23 (TFOpL (None, 7, 7, 576) 0 ['expanded_conv_10/expand/BatchNo
ambda) rm[0][0]']
re_lu_28 (ReLU) (None, 7, 7, 576) 0 ['tf.__operators__.add_23[0][0]']
tf.math.multiply_23 (TFOpLambd (None, 7, 7, 576) 0 ['re_lu_28[0][0]']
a)
multiply_15 (Multiply) (None, 7, 7, 576) 0 ['expanded_conv_10/expand/BatchNo
rm[0][0]',
'tf.math.multiply_23[0][0]']
expanded_conv_10/depthwise (De (None, 7, 7, 576) 14400 ['multiply_15[0][0]']
pthwiseConv2D)
expanded_conv_10/depthwise/Bat (None, 7, 7, 576) 2304 ['expanded_conv_10/depthwise[0][0
chNorm (BatchNormalization) ]']
tf.__operators__.add_24 (TFOpL (None, 7, 7, 576) 0 ['expanded_conv_10/depthwise/Batc
ambda) hNorm[0][0]']
re_lu_29 (ReLU) (None, 7, 7, 576) 0 ['tf.__operators__.add_24[0][0]']
tf.math.multiply_24 (TFOpLambd (None, 7, 7, 576) 0 ['re_lu_29[0][0]']
a)
multiply_16 (Multiply) (None, 7, 7, 576) 0 ['expanded_conv_10/depthwise/Batc
hNorm[0][0]',
'tf.math.multiply_24[0][0]']
expanded_conv_10/squeeze_excit (None, 1, 1, 576) 0 ['multiply_16[0][0]']
e/AvgPool (GlobalAveragePoolin
g2D)
expanded_conv_10/squeeze_excit (None, 1, 1, 144) 83088 ['expanded_conv_10/squeeze_excite
e/Conv (Conv2D) /AvgPool[0][0]']
expanded_conv_10/squeeze_excit (None, 1, 1, 144) 0 ['expanded_conv_10/squeeze_excite
e/Relu (ReLU) /Conv[0][0]']
expanded_conv_10/squeeze_excit (None, 1, 1, 576) 83520 ['expanded_conv_10/squeeze_excite
e/Conv_1 (Conv2D) /Relu[0][0]']
tf.__operators__.add_25 (TFOpL (None, 1, 1, 576) 0 ['expanded_conv_10/squeeze_excite
ambda) /Conv_1[0][0]']
re_lu_30 (ReLU) (None, 1, 1, 576) 0 ['tf.__operators__.add_25[0][0]']
tf.math.multiply_25 (TFOpLambd (None, 1, 1, 576) 0 ['re_lu_30[0][0]']
a)
expanded_conv_10/squeeze_excit (None, 7, 7, 576) 0 ['multiply_16[0][0]',
e/Mul (Multiply) 'tf.math.multiply_25[0][0]']
expanded_conv_10/project (Conv (None, 7, 7, 96) 55296 ['expanded_conv_10/squeeze_excite
2D) /Mul[0][0]']
expanded_conv_10/project/Batch (None, 7, 7, 96) 384 ['expanded_conv_10/project[0][0]'
Norm (BatchNormalization) ]
expanded_conv_10/Add (Add) (None, 7, 7, 96) 0 ['expanded_conv_9/Add[0][0]',
'expanded_conv_10/project/BatchN
orm[0][0]']
Conv_1 (Conv2D) (None, 7, 7, 576) 55296 ['expanded_conv_10/Add[0][0]']
Conv_1/BatchNorm (BatchNormali (None, 7, 7, 576) 2304 ['Conv_1[0][0]']
zation)
tf.__operators__.add_26 (TFOpL (None, 7, 7, 576) 0 ['Conv_1/BatchNorm[0][0]']
ambda)
re_lu_31 (ReLU) (None, 7, 7, 576) 0 ['tf.__operators__.add_26[0][0]']
tf.math.multiply_26 (TFOpLambd (None, 7, 7, 576) 0 ['re_lu_31[0][0]']
a)
multiply_17 (Multiply) (None, 7, 7, 576) 0 ['Conv_1/BatchNorm[0][0]',
'tf.math.multiply_26[0][0]']
Flatten_for_hidden_layers (Glo (None, 576) 0 ['multiply_17[0][0]']
balAveragePooling2D)
Dropout1 (Dropout) (None, 576) 0 ['Flatten_for_hidden_layers[0][0]
']
Hidden_Layer1 (Dense) (None, 128) 73856 ['Dropout1[0][0]']
Dropout2 (Dropout) (None, 128) 0 ['Hidden_Layer1[0][0]']
Hidden_Layer2 (Dense) (None, 128) 16512 ['Dropout2[0][0]']
Dropout3 (Dropout) (None, 128) 0 ['Hidden_Layer2[0][0]']
Hidden_Layer3 (Dense) (None, 128) 16512 ['Dropout3[0][0]']
output (Dense) (None, 4) 516 ['Hidden_Layer3[0][0]']
==================================================================================================
Total params: 1,046,516
Trainable params: 1,034,404
Non-trainable params: 12,112
__________________________________________________________________________________________________
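The parameter counts of the custom dense head in the summary above can be checked by hand: a Dense layer has `inputs × units` weights plus `units` biases. A quick sketch:

```python
# Dense layer parameter count: inputs * units (weights) + units (biases)
def dense_params(n_in, n_out):
    return n_in * n_out + n_out

# Custom head on top of the 576-wide global-average-pooled features
print(dense_params(576, 128))  # Hidden_Layer1 -> 73856
print(dense_params(128, 128))  # Hidden_Layer2 -> 16512
print(dense_params(128, 4))    # output       -> 516
```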
# plotting the model
plot_model(mobile_net_v3_sm, to_file='mobile_net_v3_small.png', show_shapes=True, show_layer_names=True)
tf.autograph.experimental.do_not_convert(func=None)
<function tensorflow.python.autograph.impl.api.do_not_convert(func=None)>
# Size of TRAIN & VALIDATION labels and BATCH SIZE
y_train.shape[0], BATCH_SIZE, y_val.shape[0]
(11138, 32, 1966)
# Calculating train steps
train_steps = y_train.shape[0] // BATCH_SIZE
train_steps
348
# Calculating validation steps
valid_steps = y_val.shape[0] // BATCH_SIZE
valid_steps
61
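A step is one whole batch, so steps per epoch are just floor division of the sample count by the batch size (any remainder is dropped). With the sizes above:

```python
# Whole batches per epoch = samples // batch_size (the remainder is dropped)
n_train, n_val, batch_size = 11138, 1966, 32
train_steps = n_train // batch_size
valid_steps = n_val // batch_size
print(train_steps, valid_steps)  # 348 61
```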
# Class weights computed earlier to counter the class imbalance
cw1_dict
{0: 0.8862189688096753,
1: 4.954626334519573,
2: 0.7281642259414226,
3: 0.7713296398891967}
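The weights above follow the usual "balanced" scheme, `weight_i = n_samples / (n_classes * count_i)`, as in scikit-learn's `compute_class_weight`. A minimal reimplementation; the per-class counts below are hypothetical values chosen to reproduce the weights shown:

```python
# 'balanced' class weights: n_samples / (n_classes * class_count)
counts = {0: 3142, 1: 562, 2: 3824, 3: 3610}  # hypothetical counts (sum = 11138)
n = sum(counts.values())
weights = {c: n / (len(counts) * k) for c, k in counts.items()}
for c, w in weights.items():
    print(c, round(w, 4))
```

Rare classes (here class 1) get a proportionally larger weight, so their losses count more during training.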
# Training the MobileNet V3 Small model having custom top
history4 = mobile_net_v3_sm.fit(X_train, y_train,
epochs=40,
batch_size=BATCH_SIZE,
callbacks=[tensorboard_callback4, reduce_lr4],
steps_per_epoch=train_steps,
validation_steps=valid_steps,
validation_data=[X_val, y_val],
class_weight=cw1_dict,
verbose=1)
Epoch 1/40
348/348 [==============================] - 42s 77ms/step - loss: 1.6770 - categorical_accuracy: 0.2493 - f1_score: 0.2314 - val_loss: 1.4507 - val_categorical_accuracy: 0.0487 - val_f1_score: 0.0232 - lr: 1.0000e-04
Epoch 2/40
348/348 [==============================] - 26s 72ms/step - loss: 1.4149 - categorical_accuracy: 0.2771 - f1_score: 0.2563 - val_loss: 1.3933 - val_categorical_accuracy: 0.3294 - val_f1_score: 0.1240 - lr: 1.0000e-04
Epoch 3/40
348/348 [==============================] - 25s 72ms/step - loss: 1.3571 - categorical_accuracy: 0.3398 - f1_score: 0.3135 - val_loss: 1.3770 - val_categorical_accuracy: 0.3294 - val_f1_score: 0.1239 - lr: 1.0000e-04
Epoch 4/40
348/348 [==============================] - 25s 71ms/step - loss: 1.2105 - categorical_accuracy: 0.4976 - f1_score: 0.4403 - val_loss: 1.4176 - val_categorical_accuracy: 0.3294 - val_f1_score: 0.1239 - lr: 1.0000e-04
Epoch 5/40
348/348 [==============================] - 25s 71ms/step - loss: 0.9261 - categorical_accuracy: 0.6857 - f1_score: 0.6019 - val_loss: 1.3868 - val_categorical_accuracy: 0.2920 - val_f1_score: 0.1130 - lr: 1.0000e-04
Epoch 6/40
Epoch 6: ReduceLROnPlateau reducing learning rate to 9.999999747378752e-06.
348/348 [==============================] - 25s 71ms/step - loss: 0.6620 - categorical_accuracy: 0.8032 - f1_score: 0.7152 - val_loss: 1.8251 - val_categorical_accuracy: 0.2920 - val_f1_score: 0.1130 - lr: 1.0000e-04
Epoch 7/40
348/348 [==============================] - 25s 71ms/step - loss: 0.5212 - categorical_accuracy: 0.8249 - f1_score: 0.7506 - val_loss: 1.9601 - val_categorical_accuracy: 0.2920 - val_f1_score: 0.1130 - lr: 1.0000e-05
Epoch 8/40
348/348 [==============================] - 25s 71ms/step - loss: 0.4883 - categorical_accuracy: 0.8514 - f1_score: 0.7762 - val_loss: 1.8498 - val_categorical_accuracy: 0.2920 - val_f1_score: 0.1130 - lr: 1.0000e-05
Epoch 9/40
Epoch 9: ReduceLROnPlateau reducing learning rate to 9.999999747378752e-07.
348/348 [==============================] - 25s 71ms/step - loss: 0.4610 - categorical_accuracy: 0.8583 - f1_score: 0.7845 - val_loss: 1.9019 - val_categorical_accuracy: 0.2920 - val_f1_score: 0.1131 - lr: 1.0000e-05
Epoch 10/40
348/348 [==============================] - 25s 71ms/step - loss: 0.4525 - categorical_accuracy: 0.8611 - f1_score: 0.7887 - val_loss: 1.9263 - val_categorical_accuracy: 0.2925 - val_f1_score: 0.1141 - lr: 1.0000e-06
Epoch 11/40
348/348 [==============================] - 25s 71ms/step - loss: 0.4401 - categorical_accuracy: 0.8614 - f1_score: 0.7899 - val_loss: 2.0199 - val_categorical_accuracy: 0.3043 - val_f1_score: 0.1421 - lr: 1.0000e-06
Epoch 12/40
Epoch 12: ReduceLROnPlateau reducing learning rate to 9.999999974752428e-08.
348/348 [==============================] - 25s 71ms/step - loss: 0.4366 - categorical_accuracy: 0.8643 - f1_score: 0.7913 - val_loss: 2.8436 - val_categorical_accuracy: 0.2987 - val_f1_score: 0.1268 - lr: 1.0000e-06
Epoch 13/40
348/348 [==============================] - 25s 71ms/step - loss: 0.4412 - categorical_accuracy: 0.8646 - f1_score: 0.7907 - val_loss: 3.0946 - val_categorical_accuracy: 0.3074 - val_f1_score: 0.1446 - lr: 1.0000e-07
Epoch 14/40
348/348 [==============================] - 25s 71ms/step - loss: 0.4637 - categorical_accuracy: 0.8639 - f1_score: 0.7907 - val_loss: 3.3901 - val_categorical_accuracy: 0.3248 - val_f1_score: 0.1690 - lr: 1.0000e-07
Epoch 15/40
Epoch 15: ReduceLROnPlateau reducing learning rate to 1.0000000116860975e-08.
348/348 [==============================] - 25s 71ms/step - loss: 0.4430 - categorical_accuracy: 0.8627 - f1_score: 0.7907 - val_loss: 3.4541 - val_categorical_accuracy: 0.3540 - val_f1_score: 0.2130 - lr: 1.0000e-07
Epoch 16/40
348/348 [==============================] - 25s 71ms/step - loss: 0.4490 - categorical_accuracy: 0.8639 - f1_score: 0.7910 - val_loss: 3.3442 - val_categorical_accuracy: 0.3919 - val_f1_score: 0.2698 - lr: 1.0000e-08
Epoch 17/40
348/348 [==============================] - 25s 71ms/step - loss: 0.4484 - categorical_accuracy: 0.8593 - f1_score: 0.7859 - val_loss: 2.9835 - val_categorical_accuracy: 0.4559 - val_f1_score: 0.3584 - lr: 1.0000e-08
Epoch 18/40
Epoch 18: ReduceLROnPlateau reducing learning rate to 9.999999939225292e-10.
348/348 [==============================] - 25s 72ms/step - loss: 0.4277 - categorical_accuracy: 0.8658 - f1_score: 0.7944 - val_loss: 2.4206 - val_categorical_accuracy: 0.5369 - val_f1_score: 0.4508 - lr: 1.0000e-08
Epoch 19/40
348/348 [==============================] - 25s 72ms/step - loss: 0.4587 - categorical_accuracy: 0.8585 - f1_score: 0.7860 - val_loss: 1.7937 - val_categorical_accuracy: 0.6168 - val_f1_score: 0.5269 - lr: 1.0000e-09
Epoch 20/40
348/348 [==============================] - 25s 72ms/step - loss: 0.4394 - categorical_accuracy: 0.8665 - f1_score: 0.7919 - val_loss: 1.2480 - val_categorical_accuracy: 0.6947 - val_f1_score: 0.6072 - lr: 1.0000e-09
Epoch 21/40
348/348 [==============================] - 25s 72ms/step - loss: 0.4574 - categorical_accuracy: 0.8632 - f1_score: 0.7889 - val_loss: 0.8509 - val_categorical_accuracy: 0.7695 - val_f1_score: 0.6782 - lr: 1.0000e-09
Epoch 22/40
348/348 [==============================] - 25s 72ms/step - loss: 0.4456 - categorical_accuracy: 0.8656 - f1_score: 0.7913 - val_loss: 0.6011 - val_categorical_accuracy: 0.8279 - val_f1_score: 0.7430 - lr: 1.0000e-09
Epoch 23/40
348/348 [==============================] - 25s 71ms/step - loss: 0.4380 - categorical_accuracy: 0.8630 - f1_score: 0.7917 - val_loss: 0.4541 - val_categorical_accuracy: 0.8540 - val_f1_score: 0.7610 - lr: 1.0000e-09
Epoch 24/40
348/348 [==============================] - 25s 71ms/step - loss: 0.4271 - categorical_accuracy: 0.8693 - f1_score: 0.7986 - val_loss: 0.3713 - val_categorical_accuracy: 0.8730 - val_f1_score: 0.7853 - lr: 1.0000e-09
Epoch 25/40
348/348 [==============================] - 25s 72ms/step - loss: 0.4441 - categorical_accuracy: 0.8631 - f1_score: 0.7907 - val_loss: 0.3225 - val_categorical_accuracy: 0.8863 - val_f1_score: 0.8077 - lr: 1.0000e-09
Epoch 26/40
348/348 [==============================] - 25s 72ms/step - loss: 0.4423 - categorical_accuracy: 0.8606 - f1_score: 0.7871 - val_loss: 0.2945 - val_categorical_accuracy: 0.8940 - val_f1_score: 0.8168 - lr: 1.0000e-09
Epoch 27/40
348/348 [==============================] - 25s 72ms/step - loss: 0.4351 - categorical_accuracy: 0.8649 - f1_score: 0.7956 - val_loss: 0.2807 - val_categorical_accuracy: 0.8970 - val_f1_score: 0.8206 - lr: 1.0000e-09
Epoch 28/40
348/348 [==============================] - 25s 72ms/step - loss: 0.4426 - categorical_accuracy: 0.8661 - f1_score: 0.7926 - val_loss: 0.2732 - val_categorical_accuracy: 0.8970 - val_f1_score: 0.8207 - lr: 1.0000e-09
Epoch 29/40
348/348 [==============================] - 25s 71ms/step - loss: 0.4391 - categorical_accuracy: 0.8633 - f1_score: 0.7953 - val_loss: 0.2697 - val_categorical_accuracy: 0.8970 - val_f1_score: 0.8222 - lr: 1.0000e-09
Epoch 30/40
348/348 [==============================] - 25s 72ms/step - loss: 0.4356 - categorical_accuracy: 0.8666 - f1_score: 0.7937 - val_loss: 0.2680 - val_categorical_accuracy: 0.8950 - val_f1_score: 0.8174 - lr: 1.0000e-09
Epoch 31/40
348/348 [==============================] - 25s 71ms/step - loss: 0.4327 - categorical_accuracy: 0.8660 - f1_score: 0.7948 - val_loss: 0.2676 - val_categorical_accuracy: 0.8965 - val_f1_score: 0.8192 - lr: 1.0000e-09
Epoch 32/40
348/348 [==============================] - 25s 71ms/step - loss: 0.4444 - categorical_accuracy: 0.8647 - f1_score: 0.7912 - val_loss: 0.2672 - val_categorical_accuracy: 0.8970 - val_f1_score: 0.8196 - lr: 1.0000e-09
Epoch 33/40
348/348 [==============================] - 25s 71ms/step - loss: 0.4433 - categorical_accuracy: 0.8650 - f1_score: 0.7918 - val_loss: 0.2670 - val_categorical_accuracy: 0.8975 - val_f1_score: 0.8200 - lr: 1.0000e-09
Epoch 34/40
348/348 [==============================] - 25s 71ms/step - loss: 0.4367 - categorical_accuracy: 0.8650 - f1_score: 0.7955 - val_loss: 0.2670 - val_categorical_accuracy: 0.8965 - val_f1_score: 0.8190 - lr: 1.0000e-09
Epoch 35/40
348/348 [==============================] - 25s 71ms/step - loss: 0.4553 - categorical_accuracy: 0.8653 - f1_score: 0.7917 - val_loss: 0.2668 - val_categorical_accuracy: 0.8981 - val_f1_score: 0.8217 - lr: 1.0000e-09
Epoch 36/40
348/348 [==============================] - 25s 71ms/step - loss: 0.4448 - categorical_accuracy: 0.8647 - f1_score: 0.7934 - val_loss: 0.2670 - val_categorical_accuracy: 0.8981 - val_f1_score: 0.8214 - lr: 1.0000e-09
Epoch 37/40
348/348 [==============================] - 25s 71ms/step - loss: 0.4516 - categorical_accuracy: 0.8583 - f1_score: 0.7845 - val_loss: 0.2669 - val_categorical_accuracy: 0.8975 - val_f1_score: 0.8210 - lr: 1.0000e-09
Epoch 38/40
Epoch 38: ReduceLROnPlateau reducing learning rate to 9.999999717180686e-11.
348/348 [==============================] - 25s 71ms/step - loss: 0.4364 - categorical_accuracy: 0.8656 - f1_score: 0.7918 - val_loss: 0.2673 - val_categorical_accuracy: 0.8970 - val_f1_score: 0.8203 - lr: 1.0000e-09
Epoch 39/40
348/348 [==============================] - 25s 71ms/step - loss: 0.4495 - categorical_accuracy: 0.8593 - f1_score: 0.7881 - val_loss: 0.2673 - val_categorical_accuracy: 0.8965 - val_f1_score: 0.8196 - lr: 1.0000e-10
Epoch 40/40
348/348 [==============================] - 25s 71ms/step - loss: 0.4433 - categorical_accuracy: 0.8624 - f1_score: 0.7897 - val_loss: 0.2674 - val_categorical_accuracy: 0.8965 - val_f1_score: 0.8196 - lr: 1.0000e-10
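`tensorboard_callback4` and `reduce_lr4` are defined earlier in the notebook; judging from the log, the scheduler cuts the learning rate by 10x whenever `val_loss` stops improving. The core of such a reduce-on-plateau rule, sketched in plain Python (the `factor` and `patience` values here are assumptions, not the notebook's exact settings):

```python
def reduce_on_plateau(val_losses, lr=1e-4, factor=0.1, patience=2):
    """Cut lr by `factor` once val_loss fails to improve for `patience` epochs."""
    best, wait, history = float("inf"), 0, []
    for loss in val_losses:
        if loss < best:          # improvement: reset the patience counter
            best, wait = loss, 0
        else:                    # no improvement: count down the patience
            wait += 1
            if wait >= patience:
                lr *= factor
                wait = 0
        history.append(lr)
    return history

# val_loss improves twice, then stalls -> lr drops after 2 bad epochs
print(reduce_on_plateau([1.45, 1.39, 1.42, 1.41, 1.40]))
```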
# Actual TGT classes distribution in validation set
# print("\n:::: Validation Set ====> ACTUAL TGT Classes Distribution ::::\n")
# display(val_tgt_classes_dist)
# Plotting the Final Results on Validation Set
print("\n:::: Validation Set ====> PREDICTION Confusion Matrix ::::\n")
mobile_net_v3_sm_global_tuning_val_results = confusion_matrix_(y_val, X_val, mobile_net_v3_sm)
# Displaying the overall performance results
print("\n:::: Validation Set ====> FINAL Results ::::\n")
display(mobile_net_v3_sm_global_tuning_val_results)
:::: Validation Set ====> PREDICTION Confusion Matrix ::::
62/62 [==============================] - 2s 24ms/step
:::: Validation Set ====> FINAL Results ::::
| Metric | Healthy | Multiple_Diseases | Rust | Scab |
|---|---|---|---|---|
| BINARY Accuracy | 0.9619 | 0.9458 | 0.9496 | 0.9479 |
| Precision | 0.9181 | 0.7941 | 0.8780 | 0.8957 |
| Recall | 0.9548 | 0.9209 | 0.8999 | 0.8957 |
| Macro F1 Score | 0.9544 | 0.7293 | 0.9502 | 0.9340 |
| Macro ROC AUC Score | 0.9598 | 0.8282 | 0.9372 | 0.9284 |
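`confusion_matrix_` is the notebook's own helper; the per-class rows it reports are one-vs-rest statistics. How such metrics fall out of a multiclass confusion matrix can be shown on a small hypothetical matrix (the values below are illustrative, not this model's):

```python
import numpy as np

def one_vs_rest_metrics(cm, i):
    """Binary accuracy, precision and recall for class i, given a confusion
    matrix whose rows are true labels and columns are predictions."""
    cm = np.asarray(cm)
    tp = cm[i, i]                        # predicted i, truly i
    fp = cm[:, i].sum() - tp             # predicted i, truly something else
    fn = cm[i, :].sum() - tp             # truly i, predicted something else
    tn = cm.sum() - tp - fp - fn         # everything else
    return (tp + tn) / cm.sum(), tp / (tp + fp), tp / (tp + fn)

# hypothetical 2-class matrix: 5 correct for class 0, 4 for class 1
acc, prec, rec = one_vs_rest_metrics([[5, 1], [2, 4]], 0)
print(round(acc, 4), round(prec, 4), round(rec, 4))
```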
OBSERVATIONS
# Actual TGT classes distribution in TEST set
# print("\n:::: TEST Set ====> ACTUAL TGT Classes Distribution ::::\n")
# display(val_tgt_classes_dist)
# Plotting the Results on TEST Set
print("\n:::: TEST Set ====> PREDICTION Confusion Matrix ::::\n")
mobile_net_v3_sm_global_tuning_test_results = confusion_matrix_(y_test, X_test, mobile_net_v3_sm)
# Displaying the overall performance results
print("\n:::: TEST Set ====> FINAL Results ::::\n")
display(mobile_net_v3_sm_global_tuning_test_results)
:::: TEST Set ====> PREDICTION Confusion Matrix ::::
12/12 [==============================] - 1s 50ms/step
:::: TEST Set ====> FINAL Results ::::
| Metric | Healthy | Multiple_Diseases | Rust | Scab |
|---|---|---|---|---|
| BINARY Accuracy | 0.9096 | 0.9027 | 0.9142 | 0.9068 |
| Precision | 0.8365 | 0.6736 | 0.7815 | 0.8137 |
| Recall | 0.8447 | 0.8017 | 0.8577 | 0.8137 |
| Macro F1 Score | 0.8887 | 0.6441 | 0.9302 | 0.8612 |
| Macro ROC AUC Score | 0.8899 | 0.7346 | 0.9310 | 0.8431 |
OBSERVATIONS
The model performs well on the Rust, Scab, and Healthy images. For the Multiple_Diseases class, which has the fewest positive cases, it still produces some false positives and false negatives, and it is not yet competent at identifying images with multiple diseases.
curr_run_logdir4.split("/")[-1]
'run_2022_11_07-03_27_58'
notebook.list()
No known TensorBoard instances running.
%tensorboard --logdir logs

OBSERVATIONS
A5. EfficientNet V2 B0¶
# build the EfficientNet V2 B0 network
effnet_v2_b0_with_no_top_model = tf.keras.applications.efficientnet_v2.EfficientNetV2B0(include_top=False, weights='imagenet', input_shape=(224,224,3))
# Model summary
effnet_v2_b0_with_no_top_model.summary()
Model: "efficientnetv2-b0"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_1 (InputLayer) [(None, 224, 224, 3 0 []
)]
rescaling (Rescaling) (None, 224, 224, 3) 0 ['input_1[0][0]']
normalization (Normalization) (None, 224, 224, 3) 0 ['rescaling[0][0]']
stem_conv (Conv2D) (None, 112, 112, 32 864 ['normalization[0][0]']
)
stem_bn (BatchNormalization) (None, 112, 112, 32 128 ['stem_conv[0][0]']
)
stem_activation (Activation) (None, 112, 112, 32 0 ['stem_bn[0][0]']
)
block1a_project_conv (Conv2D) (None, 112, 112, 16 4608 ['stem_activation[0][0]']
)
block1a_project_bn (BatchNorma (None, 112, 112, 16 64 ['block1a_project_conv[0][0]']
lization) )
block1a_project_activation (Ac (None, 112, 112, 16 0 ['block1a_project_bn[0][0]']
tivation) )
block2a_expand_conv (Conv2D) (None, 56, 56, 64) 9216 ['block1a_project_activation[0][0
]']
block2a_expand_bn (BatchNormal (None, 56, 56, 64) 256 ['block2a_expand_conv[0][0]']
ization)
block2a_expand_activation (Act (None, 56, 56, 64) 0 ['block2a_expand_bn[0][0]']
ivation)
block2a_project_conv (Conv2D) (None, 56, 56, 32) 2048 ['block2a_expand_activation[0][0]
']
block2a_project_bn (BatchNorma (None, 56, 56, 32) 128 ['block2a_project_conv[0][0]']
lization)
block2b_expand_conv (Conv2D) (None, 56, 56, 128) 36864 ['block2a_project_bn[0][0]']
block2b_expand_bn (BatchNormal (None, 56, 56, 128) 512 ['block2b_expand_conv[0][0]']
ization)
block2b_expand_activation (Act (None, 56, 56, 128) 0 ['block2b_expand_bn[0][0]']
ivation)
block2b_project_conv (Conv2D) (None, 56, 56, 32) 4096 ['block2b_expand_activation[0][0]
']
block2b_project_bn (BatchNorma (None, 56, 56, 32) 128 ['block2b_project_conv[0][0]']
lization)
block2b_drop (Dropout) (None, 56, 56, 32) 0 ['block2b_project_bn[0][0]']
block2b_add (Add) (None, 56, 56, 32) 0 ['block2b_drop[0][0]',
'block2a_project_bn[0][0]']
block3a_expand_conv (Conv2D) (None, 28, 28, 128) 36864 ['block2b_add[0][0]']
block3a_expand_bn (BatchNormal (None, 28, 28, 128) 512 ['block3a_expand_conv[0][0]']
ization)
block3a_expand_activation (Act (None, 28, 28, 128) 0 ['block3a_expand_bn[0][0]']
ivation)
block3a_project_conv (Conv2D) (None, 28, 28, 48) 6144 ['block3a_expand_activation[0][0]
']
block3a_project_bn (BatchNorma (None, 28, 28, 48) 192 ['block3a_project_conv[0][0]']
lization)
block3b_expand_conv (Conv2D) (None, 28, 28, 192) 82944 ['block3a_project_bn[0][0]']
block3b_expand_bn (BatchNormal (None, 28, 28, 192) 768 ['block3b_expand_conv[0][0]']
ization)
block3b_expand_activation (Act (None, 28, 28, 192) 0 ['block3b_expand_bn[0][0]']
ivation)
block3b_project_conv (Conv2D) (None, 28, 28, 48) 9216 ['block3b_expand_activation[0][0]
']
block3b_project_bn (BatchNorma (None, 28, 28, 48) 192 ['block3b_project_conv[0][0]']
lization)
block3b_drop (Dropout) (None, 28, 28, 48) 0 ['block3b_project_bn[0][0]']
block3b_add (Add) (None, 28, 28, 48) 0 ['block3b_drop[0][0]',
'block3a_project_bn[0][0]']
block4a_expand_conv (Conv2D) (None, 28, 28, 192) 9216 ['block3b_add[0][0]']
block4a_expand_bn (BatchNormal (None, 28, 28, 192) 768 ['block4a_expand_conv[0][0]']
ization)
block4a_expand_activation (Act (None, 28, 28, 192) 0 ['block4a_expand_bn[0][0]']
ivation)
block4a_dwconv2 (DepthwiseConv (None, 14, 14, 192) 1728 ['block4a_expand_activation[0][0]
2D) ']
block4a_bn (BatchNormalization (None, 14, 14, 192) 768 ['block4a_dwconv2[0][0]']
)
block4a_activation (Activation (None, 14, 14, 192) 0 ['block4a_bn[0][0]']
)
block4a_se_squeeze (GlobalAver (None, 192) 0 ['block4a_activation[0][0]']
agePooling2D)
block4a_se_reshape (Reshape) (None, 1, 1, 192) 0 ['block4a_se_squeeze[0][0]']
block4a_se_reduce (Conv2D) (None, 1, 1, 12) 2316 ['block4a_se_reshape[0][0]']
block4a_se_expand (Conv2D) (None, 1, 1, 192) 2496 ['block4a_se_reduce[0][0]']
block4a_se_excite (Multiply) (None, 14, 14, 192) 0 ['block4a_activation[0][0]',
'block4a_se_expand[0][0]']
block4a_project_conv (Conv2D) (None, 14, 14, 96) 18432 ['block4a_se_excite[0][0]']
block4a_project_bn (BatchNorma (None, 14, 14, 96) 384 ['block4a_project_conv[0][0]']
lization)
block4b_expand_conv (Conv2D) (None, 14, 14, 384) 36864 ['block4a_project_bn[0][0]']
block4b_expand_bn (BatchNormal (None, 14, 14, 384) 1536 ['block4b_expand_conv[0][0]']
ization)
block4b_expand_activation (Act (None, 14, 14, 384) 0 ['block4b_expand_bn[0][0]']
ivation)
block4b_dwconv2 (DepthwiseConv (None, 14, 14, 384) 3456 ['block4b_expand_activation[0][0]
2D) ']
block4b_bn (BatchNormalization (None, 14, 14, 384) 1536 ['block4b_dwconv2[0][0]']
)
block4b_activation (Activation (None, 14, 14, 384) 0 ['block4b_bn[0][0]']
)
block4b_se_squeeze (GlobalAver (None, 384) 0 ['block4b_activation[0][0]']
agePooling2D)
block4b_se_reshape (Reshape) (None, 1, 1, 384) 0 ['block4b_se_squeeze[0][0]']
block4b_se_reduce (Conv2D) (None, 1, 1, 24) 9240 ['block4b_se_reshape[0][0]']
block4b_se_expand (Conv2D) (None, 1, 1, 384) 9600 ['block4b_se_reduce[0][0]']
block4b_se_excite (Multiply) (None, 14, 14, 384) 0 ['block4b_activation[0][0]',
'block4b_se_expand[0][0]']
block4b_project_conv (Conv2D) (None, 14, 14, 96) 36864 ['block4b_se_excite[0][0]']
block4b_project_bn (BatchNorma (None, 14, 14, 96) 384 ['block4b_project_conv[0][0]']
lization)
block4b_drop (Dropout) (None, 14, 14, 96) 0 ['block4b_project_bn[0][0]']
block4b_add (Add) (None, 14, 14, 96) 0 ['block4b_drop[0][0]',
'block4a_project_bn[0][0]']
block4c_expand_conv (Conv2D) (None, 14, 14, 384) 36864 ['block4b_add[0][0]']
block4c_expand_bn (BatchNormal (None, 14, 14, 384) 1536 ['block4c_expand_conv[0][0]']
ization)
block4c_expand_activation (Act (None, 14, 14, 384) 0 ['block4c_expand_bn[0][0]']
ivation)
block4c_dwconv2 (DepthwiseConv (None, 14, 14, 384) 3456 ['block4c_expand_activation[0][0]
2D) ']
block4c_bn (BatchNormalization (None, 14, 14, 384) 1536 ['block4c_dwconv2[0][0]']
)
block4c_activation (Activation (None, 14, 14, 384) 0 ['block4c_bn[0][0]']
)
block4c_se_squeeze (GlobalAver (None, 384) 0 ['block4c_activation[0][0]']
agePooling2D)
block4c_se_reshape (Reshape) (None, 1, 1, 384) 0 ['block4c_se_squeeze[0][0]']
block4c_se_reduce (Conv2D) (None, 1, 1, 24) 9240 ['block4c_se_reshape[0][0]']
block4c_se_expand (Conv2D) (None, 1, 1, 384) 9600 ['block4c_se_reduce[0][0]']
block4c_se_excite (Multiply) (None, 14, 14, 384) 0 ['block4c_activation[0][0]',
'block4c_se_expand[0][0]']
block4c_project_conv (Conv2D) (None, 14, 14, 96) 36864 ['block4c_se_excite[0][0]']
block4c_project_bn (BatchNorma (None, 14, 14, 96) 384 ['block4c_project_conv[0][0]']
lization)
block4c_drop (Dropout) (None, 14, 14, 96) 0 ['block4c_project_bn[0][0]']
block4c_add (Add) (None, 14, 14, 96) 0 ['block4c_drop[0][0]',
'block4b_add[0][0]']
block5a_expand_conv (Conv2D) (None, 14, 14, 576) 55296 ['block4c_add[0][0]']
block5a_expand_bn (BatchNormal (None, 14, 14, 576) 2304 ['block5a_expand_conv[0][0]']
ization)
block5a_expand_activation (Act (None, 14, 14, 576) 0 ['block5a_expand_bn[0][0]']
ivation)
block5a_dwconv2 (DepthwiseConv (None, 14, 14, 576) 5184 ['block5a_expand_activation[0][0]
2D) ']
block5a_bn (BatchNormalization (None, 14, 14, 576) 2304 ['block5a_dwconv2[0][0]']
)
block5a_activation (Activation (None, 14, 14, 576) 0 ['block5a_bn[0][0]']
)
block5a_se_squeeze (GlobalAver (None, 576) 0 ['block5a_activation[0][0]']
agePooling2D)
block5a_se_reshape (Reshape) (None, 1, 1, 576) 0 ['block5a_se_squeeze[0][0]']
block5a_se_reduce (Conv2D) (None, 1, 1, 24) 13848 ['block5a_se_reshape[0][0]']
block5a_se_expand (Conv2D) (None, 1, 1, 576) 14400 ['block5a_se_reduce[0][0]']
block5a_se_excite (Multiply) (None, 14, 14, 576) 0 ['block5a_activation[0][0]',
'block5a_se_expand[0][0]']
block5a_project_conv (Conv2D) (None, 14, 14, 112) 64512 ['block5a_se_excite[0][0]']
block5a_project_bn (BatchNorma (None, 14, 14, 112) 448 ['block5a_project_conv[0][0]']
lization)
block5b_expand_conv (Conv2D) (None, 14, 14, 672) 75264 ['block5a_project_bn[0][0]']
block5b_expand_bn (BatchNormal (None, 14, 14, 672) 2688 ['block5b_expand_conv[0][0]']
ization)
block5b_expand_activation (Act (None, 14, 14, 672) 0 ['block5b_expand_bn[0][0]']
ivation)
block5b_dwconv2 (DepthwiseConv (None, 14, 14, 672) 6048 ['block5b_expand_activation[0][0]
2D) ']
block5b_bn (BatchNormalization (None, 14, 14, 672) 2688 ['block5b_dwconv2[0][0]']
)
block5b_activation (Activation (None, 14, 14, 672) 0 ['block5b_bn[0][0]']
)
block5b_se_squeeze (GlobalAver (None, 672) 0 ['block5b_activation[0][0]']
agePooling2D)
block5b_se_reshape (Reshape) (None, 1, 1, 672) 0 ['block5b_se_squeeze[0][0]']
block5b_se_reduce (Conv2D) (None, 1, 1, 28) 18844 ['block5b_se_reshape[0][0]']
block5b_se_expand (Conv2D) (None, 1, 1, 672) 19488 ['block5b_se_reduce[0][0]']
block5b_se_excite (Multiply) (None, 14, 14, 672) 0 ['block5b_activation[0][0]',
'block5b_se_expand[0][0]']
block5b_project_conv (Conv2D) (None, 14, 14, 112) 75264 ['block5b_se_excite[0][0]']
block5b_project_bn (BatchNorma (None, 14, 14, 112) 448 ['block5b_project_conv[0][0]']
lization)
block5b_drop (Dropout) (None, 14, 14, 112) 0 ['block5b_project_bn[0][0]']
block5b_add (Add) (None, 14, 14, 112) 0 ['block5b_drop[0][0]',
'block5a_project_bn[0][0]']
block5c_expand_conv (Conv2D) (None, 14, 14, 672) 75264 ['block5b_add[0][0]']
block5c_expand_bn (BatchNormal (None, 14, 14, 672) 2688 ['block5c_expand_conv[0][0]']
ization)
block5c_expand_activation (Act (None, 14, 14, 672) 0 ['block5c_expand_bn[0][0]']
ivation)
block5c_dwconv2 (DepthwiseConv (None, 14, 14, 672) 6048 ['block5c_expand_activation[0][0]
2D) ']
block5c_bn (BatchNormalization (None, 14, 14, 672) 2688 ['block5c_dwconv2[0][0]']
)
block5c_activation (Activation (None, 14, 14, 672) 0 ['block5c_bn[0][0]']
)
block5c_se_squeeze (GlobalAver (None, 672) 0 ['block5c_activation[0][0]']
agePooling2D)
block5c_se_reshape (Reshape) (None, 1, 1, 672) 0 ['block5c_se_squeeze[0][0]']
block5c_se_reduce (Conv2D) (None, 1, 1, 28) 18844 ['block5c_se_reshape[0][0]']
block5c_se_expand (Conv2D) (None, 1, 1, 672) 19488 ['block5c_se_reduce[0][0]']
block5c_se_excite (Multiply) (None, 14, 14, 672) 0 ['block5c_activation[0][0]',
'block5c_se_expand[0][0]']
block5c_project_conv (Conv2D) (None, 14, 14, 112) 75264 ['block5c_se_excite[0][0]']
block5c_project_bn (BatchNorma (None, 14, 14, 112) 448 ['block5c_project_conv[0][0]']
lization)
block5c_drop (Dropout) (None, 14, 14, 112) 0 ['block5c_project_bn[0][0]']
block5c_add (Add) (None, 14, 14, 112) 0 ['block5c_drop[0][0]',
'block5b_add[0][0]']
block5d_expand_conv (Conv2D) (None, 14, 14, 672) 75264 ['block5c_add[0][0]']
block5d_expand_bn (BatchNormal (None, 14, 14, 672) 2688 ['block5d_expand_conv[0][0]']
ization)
block5d_expand_activation (Act (None, 14, 14, 672) 0 ['block5d_expand_bn[0][0]']
ivation)
block5d_dwconv2 (DepthwiseConv (None, 14, 14, 672) 6048 ['block5d_expand_activation[0][0]
2D) ']
block5d_bn (BatchNormalization (None, 14, 14, 672) 2688 ['block5d_dwconv2[0][0]']
)
block5d_activation (Activation (None, 14, 14, 672) 0 ['block5d_bn[0][0]']
)
block5d_se_squeeze (GlobalAver (None, 672) 0 ['block5d_activation[0][0]']
agePooling2D)
block5d_se_reshape (Reshape) (None, 1, 1, 672) 0 ['block5d_se_squeeze[0][0]']
block5d_se_reduce (Conv2D) (None, 1, 1, 28) 18844 ['block5d_se_reshape[0][0]']
block5d_se_expand (Conv2D) (None, 1, 1, 672) 19488 ['block5d_se_reduce[0][0]']
block5d_se_excite (Multiply) (None, 14, 14, 672) 0 ['block5d_activation[0][0]',
'block5d_se_expand[0][0]']
block5d_project_conv (Conv2D) (None, 14, 14, 112) 75264 ['block5d_se_excite[0][0]']
block5d_project_bn (BatchNorma (None, 14, 14, 112) 448 ['block5d_project_conv[0][0]']
lization)
block5d_drop (Dropout) (None, 14, 14, 112) 0 ['block5d_project_bn[0][0]']
block5d_add (Add) (None, 14, 14, 112) 0 ['block5d_drop[0][0]',
'block5c_add[0][0]']
block5e_expand_conv (Conv2D) (None, 14, 14, 672) 75264 ['block5d_add[0][0]']
block5e_expand_bn (BatchNormal (None, 14, 14, 672) 2688 ['block5e_expand_conv[0][0]']
ization)
block5e_expand_activation (Act (None, 14, 14, 672) 0 ['block5e_expand_bn[0][0]']
ivation)
block5e_dwconv2 (DepthwiseConv (None, 14, 14, 672) 6048 ['block5e_expand_activation[0][0]
2D) ']
block5e_bn (BatchNormalization (None, 14, 14, 672) 2688 ['block5e_dwconv2[0][0]']
)
block5e_activation (Activation (None, 14, 14, 672) 0 ['block5e_bn[0][0]']
)
block5e_se_squeeze (GlobalAver (None, 672) 0 ['block5e_activation[0][0]']
agePooling2D)
block5e_se_reshape (Reshape) (None, 1, 1, 672) 0 ['block5e_se_squeeze[0][0]']
block5e_se_reduce (Conv2D) (None, 1, 1, 28) 18844 ['block5e_se_reshape[0][0]']
block5e_se_expand (Conv2D) (None, 1, 1, 672) 19488 ['block5e_se_reduce[0][0]']
block5e_se_excite (Multiply) (None, 14, 14, 672) 0 ['block5e_activation[0][0]',
'block5e_se_expand[0][0]']
block5e_project_conv (Conv2D) (None, 14, 14, 112) 75264 ['block5e_se_excite[0][0]']
block5e_project_bn (BatchNorma (None, 14, 14, 112) 448 ['block5e_project_conv[0][0]']
lization)
block5e_drop (Dropout) (None, 14, 14, 112) 0 ['block5e_project_bn[0][0]']
block5e_add (Add) (None, 14, 14, 112) 0 ['block5e_drop[0][0]',
'block5d_add[0][0]']
block6a_expand_conv (Conv2D) (None, 14, 14, 672) 75264 ['block5e_add[0][0]']
block6a_expand_bn (BatchNormal (None, 14, 14, 672) 2688 ['block6a_expand_conv[0][0]']
ization)
block6a_expand_activation (Act (None, 14, 14, 672) 0 ['block6a_expand_bn[0][0]']
ivation)
block6a_dwconv2 (DepthwiseConv (None, 7, 7, 672) 6048 ['block6a_expand_activation[0][0]
2D) ']
block6a_bn (BatchNormalization (None, 7, 7, 672) 2688 ['block6a_dwconv2[0][0]']
)
block6a_activation (Activation (None, 7, 7, 672) 0 ['block6a_bn[0][0]']
)
block6a_se_squeeze (GlobalAver (None, 672) 0 ['block6a_activation[0][0]']
agePooling2D)
block6a_se_reshape (Reshape) (None, 1, 1, 672) 0 ['block6a_se_squeeze[0][0]']
block6a_se_reduce (Conv2D) (None, 1, 1, 28) 18844 ['block6a_se_reshape[0][0]']
block6a_se_expand (Conv2D) (None, 1, 1, 672) 19488 ['block6a_se_reduce[0][0]']
block6a_se_excite (Multiply) (None, 7, 7, 672) 0 ['block6a_activation[0][0]',
'block6a_se_expand[0][0]']
block6a_project_conv (Conv2D) (None, 7, 7, 192) 129024 ['block6a_se_excite[0][0]']
block6a_project_bn (BatchNorma (None, 7, 7, 192) 768 ['block6a_project_conv[0][0]']
lization)
block6b_expand_conv (Conv2D) (None, 7, 7, 1152) 221184 ['block6a_project_bn[0][0]']
block6b_expand_bn (BatchNormal (None, 7, 7, 1152) 4608 ['block6b_expand_conv[0][0]']
ization)
block6b_expand_activation (Act (None, 7, 7, 1152) 0 ['block6b_expand_bn[0][0]']
ivation)
block6b_dwconv2 (DepthwiseConv (None, 7, 7, 1152) 10368 ['block6b_expand_activation[0][0]
2D) ']
block6b_bn (BatchNormalization (None, 7, 7, 1152) 4608 ['block6b_dwconv2[0][0]']
)
block6b_activation (Activation (None, 7, 7, 1152) 0 ['block6b_bn[0][0]']
)
block6b_se_squeeze (GlobalAver (None, 1152) 0 ['block6b_activation[0][0]']
agePooling2D)
block6b_se_reshape (Reshape) (None, 1, 1, 1152) 0 ['block6b_se_squeeze[0][0]']
block6b_se_reduce (Conv2D) (None, 1, 1, 48) 55344 ['block6b_se_reshape[0][0]']
block6b_se_expand (Conv2D) (None, 1, 1, 1152) 56448 ['block6b_se_reduce[0][0]']
block6b_se_excite (Multiply) (None, 7, 7, 1152) 0 ['block6b_activation[0][0]',
'block6b_se_expand[0][0]']
block6b_project_conv (Conv2D) (None, 7, 7, 192) 221184 ['block6b_se_excite[0][0]']
block6b_project_bn (BatchNorma (None, 7, 7, 192) 768 ['block6b_project_conv[0][0]']
lization)
block6b_drop (Dropout) (None, 7, 7, 192) 0 ['block6b_project_bn[0][0]']
block6b_add (Add) (None, 7, 7, 192) 0 ['block6b_drop[0][0]',
'block6a_project_bn[0][0]']
block6c_expand_conv (Conv2D) (None, 7, 7, 1152) 221184 ['block6b_add[0][0]']
block6c_expand_bn (BatchNormal (None, 7, 7, 1152) 4608 ['block6c_expand_conv[0][0]']
ization)
block6c_expand_activation (Act (None, 7, 7, 1152) 0 ['block6c_expand_bn[0][0]']
ivation)
block6c_dwconv2 (DepthwiseConv (None, 7, 7, 1152) 10368 ['block6c_expand_activation[0][0]
2D) ']
block6c_bn (BatchNormalization (None, 7, 7, 1152) 4608 ['block6c_dwconv2[0][0]']
)
block6c_activation (Activation (None, 7, 7, 1152) 0 ['block6c_bn[0][0]']
)
block6c_se_squeeze (GlobalAver (None, 1152) 0 ['block6c_activation[0][0]']
agePooling2D)
block6c_se_reshape (Reshape) (None, 1, 1, 1152) 0 ['block6c_se_squeeze[0][0]']
block6c_se_reduce (Conv2D) (None, 1, 1, 48) 55344 ['block6c_se_reshape[0][0]']
block6c_se_expand (Conv2D) (None, 1, 1, 1152) 56448 ['block6c_se_reduce[0][0]']
block6c_se_excite (Multiply) (None, 7, 7, 1152) 0 ['block6c_activation[0][0]',
'block6c_se_expand[0][0]']
block6c_project_conv (Conv2D) (None, 7, 7, 192) 221184 ['block6c_se_excite[0][0]']
block6c_project_bn (BatchNorma (None, 7, 7, 192) 768 ['block6c_project_conv[0][0]']
lization)
block6c_drop (Dropout) (None, 7, 7, 192) 0 ['block6c_project_bn[0][0]']
block6c_add (Add) (None, 7, 7, 192) 0 ['block6c_drop[0][0]',
'block6b_add[0][0]']
block6d_expand_conv (Conv2D) (None, 7, 7, 1152) 221184 ['block6c_add[0][0]']
block6d_expand_bn (BatchNormal (None, 7, 7, 1152) 4608 ['block6d_expand_conv[0][0]']
ization)
block6d_expand_activation (Act (None, 7, 7, 1152) 0 ['block6d_expand_bn[0][0]']
ivation)
block6d_dwconv2 (DepthwiseConv (None, 7, 7, 1152) 10368 ['block6d_expand_activation[0][0]
2D) ']
block6d_bn (BatchNormalization (None, 7, 7, 1152) 4608 ['block6d_dwconv2[0][0]']
)
block6d_activation (Activation (None, 7, 7, 1152) 0 ['block6d_bn[0][0]']
)
block6d_se_squeeze (GlobalAver (None, 1152) 0 ['block6d_activation[0][0]']
agePooling2D)
block6d_se_reshape (Reshape) (None, 1, 1, 1152) 0 ['block6d_se_squeeze[0][0]']
block6d_se_reduce (Conv2D) (None, 1, 1, 48) 55344 ['block6d_se_reshape[0][0]']
block6d_se_expand (Conv2D) (None, 1, 1, 1152) 56448 ['block6d_se_reduce[0][0]']
block6d_se_excite (Multiply) (None, 7, 7, 1152) 0 ['block6d_activation[0][0]',
'block6d_se_expand[0][0]']
block6d_project_conv (Conv2D) (None, 7, 7, 192) 221184 ['block6d_se_excite[0][0]']
block6d_project_bn (BatchNorma (None, 7, 7, 192) 768 ['block6d_project_conv[0][0]']
lization)
block6d_drop (Dropout) (None, 7, 7, 192) 0 ['block6d_project_bn[0][0]']
block6d_add (Add) (None, 7, 7, 192) 0 ['block6d_drop[0][0]',
'block6c_add[0][0]']
block6e_expand_conv (Conv2D) (None, 7, 7, 1152) 221184 ['block6d_add[0][0]']
block6e_expand_bn (BatchNormal (None, 7, 7, 1152) 4608 ['block6e_expand_conv[0][0]']
ization)
block6e_expand_activation (Act (None, 7, 7, 1152) 0 ['block6e_expand_bn[0][0]']
ivation)
block6e_dwconv2 (DepthwiseConv (None, 7, 7, 1152) 10368 ['block6e_expand_activation[0][0]
2D) ']
block6e_bn (BatchNormalization (None, 7, 7, 1152) 4608 ['block6e_dwconv2[0][0]']
)
block6e_activation (Activation (None, 7, 7, 1152) 0 ['block6e_bn[0][0]']
)
block6e_se_squeeze (GlobalAver (None, 1152) 0 ['block6e_activation[0][0]']
agePooling2D)
block6e_se_reshape (Reshape) (None, 1, 1, 1152) 0 ['block6e_se_squeeze[0][0]']
block6e_se_reduce (Conv2D) (None, 1, 1, 48) 55344 ['block6e_se_reshape[0][0]']
block6e_se_expand (Conv2D) (None, 1, 1, 1152) 56448 ['block6e_se_reduce[0][0]']
block6e_se_excite (Multiply) (None, 7, 7, 1152) 0 ['block6e_activation[0][0]',
'block6e_se_expand[0][0]']
block6e_project_conv (Conv2D) (None, 7, 7, 192) 221184 ['block6e_se_excite[0][0]']
block6e_project_bn (BatchNorma (None, 7, 7, 192) 768 ['block6e_project_conv[0][0]']
lization)
block6e_drop (Dropout) (None, 7, 7, 192) 0 ['block6e_project_bn[0][0]']
block6e_add (Add) (None, 7, 7, 192) 0 ['block6e_drop[0][0]',
'block6d_add[0][0]']
block6f_expand_conv (Conv2D) (None, 7, 7, 1152) 221184 ['block6e_add[0][0]']
block6f_expand_bn (BatchNormal (None, 7, 7, 1152) 4608 ['block6f_expand_conv[0][0]']
ization)
block6f_expand_activation (Act (None, 7, 7, 1152) 0 ['block6f_expand_bn[0][0]']
ivation)
block6f_dwconv2 (DepthwiseConv (None, 7, 7, 1152) 10368 ['block6f_expand_activation[0][0]
2D) ']
block6f_bn (BatchNormalization (None, 7, 7, 1152) 4608 ['block6f_dwconv2[0][0]']
)
block6f_activation (Activation (None, 7, 7, 1152) 0 ['block6f_bn[0][0]']
)
block6f_se_squeeze (GlobalAver (None, 1152) 0 ['block6f_activation[0][0]']
agePooling2D)
block6f_se_reshape (Reshape) (None, 1, 1, 1152) 0 ['block6f_se_squeeze[0][0]']
block6f_se_reduce (Conv2D) (None, 1, 1, 48) 55344 ['block6f_se_reshape[0][0]']
block6f_se_expand (Conv2D) (None, 1, 1, 1152) 56448 ['block6f_se_reduce[0][0]']
block6f_se_excite (Multiply) (None, 7, 7, 1152) 0 ['block6f_activation[0][0]',
'block6f_se_expand[0][0]']
block6f_project_conv (Conv2D) (None, 7, 7, 192) 221184 ['block6f_se_excite[0][0]']
block6f_project_bn (BatchNorma (None, 7, 7, 192) 768 ['block6f_project_conv[0][0]']
lization)
block6f_drop (Dropout) (None, 7, 7, 192) 0 ['block6f_project_bn[0][0]']
block6f_add (Add) (None, 7, 7, 192) 0 ['block6f_drop[0][0]',
'block6e_add[0][0]']
block6g_expand_conv (Conv2D) (None, 7, 7, 1152) 221184 ['block6f_add[0][0]']
block6g_expand_bn (BatchNormal (None, 7, 7, 1152) 4608 ['block6g_expand_conv[0][0]']
ization)
block6g_expand_activation (Act (None, 7, 7, 1152) 0 ['block6g_expand_bn[0][0]']
ivation)
block6g_dwconv2 (DepthwiseConv (None, 7, 7, 1152) 10368 ['block6g_expand_activation[0][0]
2D) ']
block6g_bn (BatchNormalization (None, 7, 7, 1152) 4608 ['block6g_dwconv2[0][0]']
)
block6g_activation (Activation (None, 7, 7, 1152) 0 ['block6g_bn[0][0]']
)
block6g_se_squeeze (GlobalAver (None, 1152) 0 ['block6g_activation[0][0]']
agePooling2D)
block6g_se_reshape (Reshape) (None, 1, 1, 1152) 0 ['block6g_se_squeeze[0][0]']
block6g_se_reduce (Conv2D) (None, 1, 1, 48) 55344 ['block6g_se_reshape[0][0]']
block6g_se_expand (Conv2D) (None, 1, 1, 1152) 56448 ['block6g_se_reduce[0][0]']
block6g_se_excite (Multiply) (None, 7, 7, 1152) 0 ['block6g_activation[0][0]',
'block6g_se_expand[0][0]']
block6g_project_conv (Conv2D) (None, 7, 7, 192) 221184 ['block6g_se_excite[0][0]']
block6g_project_bn (BatchNorma (None, 7, 7, 192) 768 ['block6g_project_conv[0][0]']
lization)
block6g_drop (Dropout) (None, 7, 7, 192) 0 ['block6g_project_bn[0][0]']
block6g_add (Add) (None, 7, 7, 192) 0 ['block6g_drop[0][0]',
'block6f_add[0][0]']
block6h_expand_conv (Conv2D) (None, 7, 7, 1152) 221184 ['block6g_add[0][0]']
block6h_expand_bn (BatchNormal (None, 7, 7, 1152) 4608 ['block6h_expand_conv[0][0]']
ization)
block6h_expand_activation (Act (None, 7, 7, 1152) 0 ['block6h_expand_bn[0][0]']
ivation)
block6h_dwconv2 (DepthwiseConv (None, 7, 7, 1152) 10368 ['block6h_expand_activation[0][0]
2D) ']
block6h_bn (BatchNormalization (None, 7, 7, 1152) 4608 ['block6h_dwconv2[0][0]']
)
block6h_activation (Activation (None, 7, 7, 1152) 0 ['block6h_bn[0][0]']
)
block6h_se_squeeze (GlobalAver (None, 1152) 0 ['block6h_activation[0][0]']
agePooling2D)
block6h_se_reshape (Reshape) (None, 1, 1, 1152) 0 ['block6h_se_squeeze[0][0]']
block6h_se_reduce (Conv2D) (None, 1, 1, 48) 55344 ['block6h_se_reshape[0][0]']
block6h_se_expand (Conv2D) (None, 1, 1, 1152) 56448 ['block6h_se_reduce[0][0]']
block6h_se_excite (Multiply) (None, 7, 7, 1152) 0 ['block6h_activation[0][0]',
'block6h_se_expand[0][0]']
block6h_project_conv (Conv2D) (None, 7, 7, 192) 221184 ['block6h_se_excite[0][0]']
block6h_project_bn (BatchNorma (None, 7, 7, 192) 768 ['block6h_project_conv[0][0]']
lization)
block6h_drop (Dropout) (None, 7, 7, 192) 0 ['block6h_project_bn[0][0]']
block6h_add (Add) (None, 7, 7, 192) 0 ['block6h_drop[0][0]',
'block6g_add[0][0]']
top_conv (Conv2D) (None, 7, 7, 1280) 245760 ['block6h_add[0][0]']
top_bn (BatchNormalization) (None, 7, 7, 1280) 5120 ['top_conv[0][0]']
top_activation (Activation) (None, 7, 7, 1280) 0 ['top_bn[0][0]']
==================================================================================================
Total params: 5,919,312
Trainable params: 5,858,704
Non-trainable params: 60,608
__________________________________________________________________________________________________
# Instantiating the Adam optimizer
learning_rate = 0.0001
opt5 = tf.keras.optimizers.Adam(learning_rate=learning_rate)
# Reduce Learning Rate on Plateau
reduce_lr5 = tf.keras.callbacks.ReduceLROnPlateau(monitor='val_loss', factor=0.1, patience=2, verbose=1, mode='auto', min_delta=0.0001)
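The ReduceLROnPlateau callback above multiplies the learning rate by `factor` once `val_loss` has failed to improve by at least `min_delta` for `patience` consecutive epochs. A minimal pure-Python simulation of that schedule (the loss values are hypothetical; the parameters mirror the cell above, and Keras's actual implementation additionally supports a cooldown period):

```python
def simulate_reduce_lr_on_plateau(val_losses, lr=1e-4, factor=0.1,
                                  patience=2, min_delta=1e-4):
    """Trace the learning rate a ReduceLROnPlateau-style schedule applies per epoch."""
    best = float('inf')
    wait = 0
    history = []
    for loss in val_losses:
        history.append(lr)
        if loss < best - min_delta:  # improvement: reset the patience counter
            best = loss
            wait = 0
        else:                        # no meaningful improvement this epoch
            wait += 1
            if wait >= patience:     # plateau reached: shrink the learning rate
                lr *= factor
                wait = 0
    return history

# Hypothetical run: val_loss stalls at 0.7 for epochs 3-4, so the
# learning rate drops from 1e-4 to 1e-5 at epoch 5
print(simulate_reduce_lr_on_plateau([0.9, 0.7, 0.7, 0.7, 0.6]))
```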
# Logs directory
curr_run_logdir5 = get_run_logdir()
# Instantiating Tensorboard callback
tensorboard_callback5 = TensorBoard(log_dir=curr_run_logdir5, histogram_freq=1)
# Setting the seed
os.environ['PYTHONHASHSEED'] = '0'
# Clearing the TF session
tf.keras.backend.clear_session()
# Unfreezing all layers of the EfficientNetV2-B0 base model for fine-tuning
for layer in effnet_v2_b0_with_no_top_model.layers:
    layer.trainable = True
# Taking the output tensor of the EfficientNetV2-B0 base model
base_output = effnet_v2_b0_with_no_top_model.output
# Defining the custom top layers of the model
flatten = tf.keras.layers.GlobalAveragePooling2D(name='Flatten_for_hidden_layers')(base_output)
dropout_1 = Dropout(rate=0.5, name='Dropout1')(flatten)
dense_layer1 = tf.keras.layers.Dense(units=128,
activation='relu',
use_bias=True,
kernel_initializer=tf.keras.initializers.he_normal(seed=80),
bias_initializer=tf.keras.initializers.he_normal(seed=110),
name='Hidden_Layer1')(dropout_1)
dropout_2 = Dropout(rate=0.5, name='Dropout2')(dense_layer1)
dense_layer2 = tf.keras.layers.Dense(units=128,
activation='relu',
use_bias=True,
kernel_initializer=tf.keras.initializers.he_normal(seed=80),
bias_initializer=tf.keras.initializers.he_normal(seed=110),
name='Hidden_Layer2')(dropout_2)
dropout_3 = Dropout(rate=0.5, name='Dropout3')(dense_layer2)
dense_layer3 = tf.keras.layers.Dense(units=128,
activation='relu',
use_bias=True,
kernel_initializer=tf.keras.initializers.he_normal(seed=80),
bias_initializer=tf.keras.initializers.he_normal(seed=110),
name='Hidden_Layer3')(dropout_3)
output_layer = tf.keras.layers.Dense(4, activation='softmax', name="output")(dense_layer3)
# Instantiating the complete model
eff_net_v2_b0 = Model(inputs=effnet_v2_b0_with_no_top_model.input, outputs=output_layer)
# Compiling the model
eff_net_v2_b0.compile(optimizer=opt5,
                      loss='categorical_crossentropy',
                      metrics=['categorical_accuracy', tfa_f1_scr])
# Summary of the EfficientNetV2-B0 model with the custom top
eff_net_v2_b0.summary()
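As a sanity check on the summary below, the parameters added by the custom top can be computed by hand: a Dense layer contributes `in_units * out_units` weights plus `out_units` biases, and the GlobalAveragePooling2D over the backbone's 7x7x1280 `top_activation` output feeds 1280 features into the first hidden layer. A small sketch (the backbone total of 5,919,312 is taken from the base-model summary above):

```python
def dense_params(n_in, n_out):
    """Weights plus biases for a fully connected layer."""
    return n_in * n_out + n_out

gap_features = 1280  # GlobalAveragePooling2D over the 7x7x1280 backbone output

top = (dense_params(gap_features, 128)   # Hidden_Layer1: 163,968
       + dense_params(128, 128)          # Hidden_Layer2:  16,512
       + dense_params(128, 128)          # Hidden_Layer3:  16,512
       + dense_params(128, 4))           # 4-class softmax output: 516

print(top)                               # 197508 parameters added by the top
backbone_total = 5_919_312               # from the base-model summary above
print(backbone_total + top)              # 6116820 expected total for eff_net_v2_b0
```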
Model: "model"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_1 (InputLayer) [(None, 224, 224, 3 0 []
)]
rescaling (Rescaling) (None, 224, 224, 3) 0 ['input_1[0][0]']
normalization (Normalization) (None, 224, 224, 3) 0 ['rescaling[0][0]']
stem_conv (Conv2D) (None, 112, 112, 32 864 ['normalization[0][0]']
)
stem_bn (BatchNormalization) (None, 112, 112, 32 128 ['stem_conv[0][0]']
)
stem_activation (Activation) (None, 112, 112, 32 0 ['stem_bn[0][0]']
)
block1a_project_conv (Conv2D) (None, 112, 112, 16 4608 ['stem_activation[0][0]']
)
block1a_project_bn (BatchNorma (None, 112, 112, 16 64 ['block1a_project_conv[0][0]']
lization) )
block1a_project_activation (Ac (None, 112, 112, 16 0 ['block1a_project_bn[0][0]']
tivation) )
block2a_expand_conv (Conv2D) (None, 56, 56, 64) 9216 ['block1a_project_activation[0][0
]']
block2a_expand_bn (BatchNormal (None, 56, 56, 64) 256 ['block2a_expand_conv[0][0]']
ization)
block2a_expand_activation (Act (None, 56, 56, 64) 0 ['block2a_expand_bn[0][0]']
ivation)
block2a_project_conv (Conv2D) (None, 56, 56, 32) 2048 ['block2a_expand_activation[0][0]
']
block2a_project_bn (BatchNorma (None, 56, 56, 32) 128 ['block2a_project_conv[0][0]']
lization)
block2b_expand_conv (Conv2D) (None, 56, 56, 128) 36864 ['block2a_project_bn[0][0]']
block2b_expand_bn (BatchNormal (None, 56, 56, 128) 512 ['block2b_expand_conv[0][0]']
ization)
block2b_expand_activation (Act (None, 56, 56, 128) 0 ['block2b_expand_bn[0][0]']
ivation)
block2b_project_conv (Conv2D) (None, 56, 56, 32) 4096 ['block2b_expand_activation[0][0]
']
block2b_project_bn (BatchNorma (None, 56, 56, 32) 128 ['block2b_project_conv[0][0]']
lization)
block2b_drop (Dropout) (None, 56, 56, 32) 0 ['block2b_project_bn[0][0]']
block2b_add (Add) (None, 56, 56, 32) 0 ['block2b_drop[0][0]',
'block2a_project_bn[0][0]']
block3a_expand_conv (Conv2D) (None, 28, 28, 128) 36864 ['block2b_add[0][0]']
block3a_expand_bn (BatchNormal (None, 28, 28, 128) 512 ['block3a_expand_conv[0][0]']
ization)
block3a_expand_activation (Act (None, 28, 28, 128) 0 ['block3a_expand_bn[0][0]']
ivation)
block3a_project_conv (Conv2D) (None, 28, 28, 48) 6144 ['block3a_expand_activation[0][0]
']
block3a_project_bn (BatchNorma (None, 28, 28, 48) 192 ['block3a_project_conv[0][0]']
lization)
block3b_expand_conv (Conv2D) (None, 28, 28, 192) 82944 ['block3a_project_bn[0][0]']
block3b_expand_bn (BatchNormal (None, 28, 28, 192) 768 ['block3b_expand_conv[0][0]']
ization)
block3b_expand_activation (Act (None, 28, 28, 192) 0 ['block3b_expand_bn[0][0]']
ivation)
block3b_project_conv (Conv2D) (None, 28, 28, 48) 9216 ['block3b_expand_activation[0][0]
']
block3b_project_bn (BatchNorma (None, 28, 28, 48) 192 ['block3b_project_conv[0][0]']
lization)
block3b_drop (Dropout) (None, 28, 28, 48) 0 ['block3b_project_bn[0][0]']
block3b_add (Add) (None, 28, 28, 48) 0 ['block3b_drop[0][0]',
'block3a_project_bn[0][0]']
block4a_expand_conv (Conv2D) (None, 28, 28, 192) 9216 ['block3b_add[0][0]']
block4a_expand_bn (BatchNormal (None, 28, 28, 192) 768 ['block4a_expand_conv[0][0]']
ization)
block4a_expand_activation (Act (None, 28, 28, 192) 0 ['block4a_expand_bn[0][0]']
ivation)
block4a_dwconv2 (DepthwiseConv (None, 14, 14, 192) 1728 ['block4a_expand_activation[0][0]
2D) ']
block4a_bn (BatchNormalization (None, 14, 14, 192) 768 ['block4a_dwconv2[0][0]']
)
block4a_activation (Activation (None, 14, 14, 192) 0 ['block4a_bn[0][0]']
)
block4a_se_squeeze (GlobalAver (None, 192) 0 ['block4a_activation[0][0]']
agePooling2D)
block4a_se_reshape (Reshape) (None, 1, 1, 192) 0 ['block4a_se_squeeze[0][0]']
block4a_se_reduce (Conv2D) (None, 1, 1, 12) 2316 ['block4a_se_reshape[0][0]']
block4a_se_expand (Conv2D) (None, 1, 1, 192) 2496 ['block4a_se_reduce[0][0]']
block4a_se_excite (Multiply) (None, 14, 14, 192) 0 ['block4a_activation[0][0]',
'block4a_se_expand[0][0]']
block4a_project_conv (Conv2D) (None, 14, 14, 96) 18432 ['block4a_se_excite[0][0]']
block4a_project_bn (BatchNorma (None, 14, 14, 96) 384 ['block4a_project_conv[0][0]']
lization)
block4b_expand_conv (Conv2D) (None, 14, 14, 384) 36864 ['block4a_project_bn[0][0]']
block4b_expand_bn (BatchNormal (None, 14, 14, 384) 1536 ['block4b_expand_conv[0][0]']
ization)
block4b_expand_activation (Act (None, 14, 14, 384) 0 ['block4b_expand_bn[0][0]']
ivation)
block4b_dwconv2 (DepthwiseConv (None, 14, 14, 384) 3456 ['block4b_expand_activation[0][0]
2D) ']
block4b_bn (BatchNormalization (None, 14, 14, 384) 1536 ['block4b_dwconv2[0][0]']
)
block4b_activation (Activation (None, 14, 14, 384) 0 ['block4b_bn[0][0]']
)
block4b_se_squeeze (GlobalAver (None, 384) 0 ['block4b_activation[0][0]']
agePooling2D)
block4b_se_reshape (Reshape) (None, 1, 1, 384) 0 ['block4b_se_squeeze[0][0]']
block4b_se_reduce (Conv2D) (None, 1, 1, 24) 9240 ['block4b_se_reshape[0][0]']
block4b_se_expand (Conv2D) (None, 1, 1, 384) 9600 ['block4b_se_reduce[0][0]']
block4b_se_excite (Multiply) (None, 14, 14, 384) 0 ['block4b_activation[0][0]',
'block4b_se_expand[0][0]']
block4b_project_conv (Conv2D) (None, 14, 14, 96) 36864 ['block4b_se_excite[0][0]']
block4b_project_bn (BatchNorma (None, 14, 14, 96) 384 ['block4b_project_conv[0][0]']
lization)
block4b_drop (Dropout) (None, 14, 14, 96) 0 ['block4b_project_bn[0][0]']
block4b_add (Add) (None, 14, 14, 96) 0 ['block4b_drop[0][0]',
'block4a_project_bn[0][0]']
block4c_expand_conv (Conv2D) (None, 14, 14, 384) 36864 ['block4b_add[0][0]']
block4c_expand_bn (BatchNormal (None, 14, 14, 384) 1536 ['block4c_expand_conv[0][0]']
ization)
block4c_expand_activation (Act (None, 14, 14, 384) 0 ['block4c_expand_bn[0][0]']
ivation)
block4c_dwconv2 (DepthwiseConv (None, 14, 14, 384) 3456 ['block4c_expand_activation[0][0]
2D) ']
block4c_bn (BatchNormalization (None, 14, 14, 384) 1536 ['block4c_dwconv2[0][0]']
)
block4c_activation (Activation (None, 14, 14, 384) 0 ['block4c_bn[0][0]']
)
block4c_se_squeeze (GlobalAver (None, 384) 0 ['block4c_activation[0][0]']
agePooling2D)
block4c_se_reshape (Reshape) (None, 1, 1, 384) 0 ['block4c_se_squeeze[0][0]']
block4c_se_reduce (Conv2D) (None, 1, 1, 24) 9240 ['block4c_se_reshape[0][0]']
block4c_se_expand (Conv2D) (None, 1, 1, 384) 9600 ['block4c_se_reduce[0][0]']
block4c_se_excite (Multiply) (None, 14, 14, 384) 0 ['block4c_activation[0][0]',
'block4c_se_expand[0][0]']
block4c_project_conv (Conv2D) (None, 14, 14, 96) 36864 ['block4c_se_excite[0][0]']
block4c_project_bn (BatchNorma (None, 14, 14, 96) 384 ['block4c_project_conv[0][0]']
lization)
block4c_drop (Dropout) (None, 14, 14, 96) 0 ['block4c_project_bn[0][0]']
block4c_add (Add) (None, 14, 14, 96) 0 ['block4c_drop[0][0]',
'block4b_add[0][0]']
block5a_expand_conv (Conv2D) (None, 14, 14, 576) 55296 ['block4c_add[0][0]']
block5a_expand_bn (BatchNormal (None, 14, 14, 576) 2304 ['block5a_expand_conv[0][0]']
ization)
block5a_expand_activation (Act (None, 14, 14, 576) 0 ['block5a_expand_bn[0][0]']
ivation)
block5a_dwconv2 (DepthwiseConv (None, 14, 14, 576) 5184 ['block5a_expand_activation[0][0]
2D) ']
block5a_bn (BatchNormalization (None, 14, 14, 576) 2304 ['block5a_dwconv2[0][0]']
)
block5a_activation (Activation (None, 14, 14, 576) 0 ['block5a_bn[0][0]']
)
block5a_se_squeeze (GlobalAver (None, 576) 0 ['block5a_activation[0][0]']
agePooling2D)
block5a_se_reshape (Reshape) (None, 1, 1, 576) 0 ['block5a_se_squeeze[0][0]']
block5a_se_reduce (Conv2D) (None, 1, 1, 24) 13848 ['block5a_se_reshape[0][0]']
block5a_se_expand (Conv2D) (None, 1, 1, 576) 14400 ['block5a_se_reduce[0][0]']
block5a_se_excite (Multiply) (None, 14, 14, 576) 0 ['block5a_activation[0][0]',
'block5a_se_expand[0][0]']
block5a_project_conv (Conv2D) (None, 14, 14, 112) 64512 ['block5a_se_excite[0][0]']
block5a_project_bn (BatchNorma (None, 14, 14, 112) 448 ['block5a_project_conv[0][0]']
lization)
block5b_expand_conv (Conv2D) (None, 14, 14, 672) 75264 ['block5a_project_bn[0][0]']
block5b_expand_bn (BatchNormal (None, 14, 14, 672) 2688 ['block5b_expand_conv[0][0]']
ization)
block5b_expand_activation (Act (None, 14, 14, 672) 0 ['block5b_expand_bn[0][0]']
ivation)
block5b_dwconv2 (DepthwiseConv (None, 14, 14, 672) 6048 ['block5b_expand_activation[0][0]
2D) ']
block5b_bn (BatchNormalization (None, 14, 14, 672) 2688 ['block5b_dwconv2[0][0]']
)
block5b_activation (Activation (None, 14, 14, 672) 0 ['block5b_bn[0][0]']
)
block5b_se_squeeze (GlobalAver (None, 672) 0 ['block5b_activation[0][0]']
agePooling2D)
block5b_se_reshape (Reshape) (None, 1, 1, 672) 0 ['block5b_se_squeeze[0][0]']
block5b_se_reduce (Conv2D) (None, 1, 1, 28) 18844 ['block5b_se_reshape[0][0]']
block5b_se_expand (Conv2D) (None, 1, 1, 672) 19488 ['block5b_se_reduce[0][0]']
block5b_se_excite (Multiply) (None, 14, 14, 672) 0 ['block5b_activation[0][0]',
'block5b_se_expand[0][0]']
block5b_project_conv (Conv2D) (None, 14, 14, 112) 75264 ['block5b_se_excite[0][0]']
block5b_project_bn (BatchNorma (None, 14, 14, 112) 448 ['block5b_project_conv[0][0]']
lization)
block5b_drop (Dropout) (None, 14, 14, 112) 0 ['block5b_project_bn[0][0]']
block5b_add (Add) (None, 14, 14, 112) 0 ['block5b_drop[0][0]',
'block5a_project_bn[0][0]']
block5c_expand_conv (Conv2D) (None, 14, 14, 672) 75264 ['block5b_add[0][0]']
block5c_expand_bn (BatchNormal (None, 14, 14, 672) 2688 ['block5c_expand_conv[0][0]']
ization)
block5c_expand_activation (Act (None, 14, 14, 672) 0 ['block5c_expand_bn[0][0]']
ivation)
block5c_dwconv2 (DepthwiseConv (None, 14, 14, 672) 6048 ['block5c_expand_activation[0][0]
2D) ']
block5c_bn (BatchNormalization (None, 14, 14, 672) 2688 ['block5c_dwconv2[0][0]']
)
block5c_activation (Activation (None, 14, 14, 672) 0 ['block5c_bn[0][0]']
)
block5c_se_squeeze (GlobalAver (None, 672) 0 ['block5c_activation[0][0]']
agePooling2D)
block5c_se_reshape (Reshape) (None, 1, 1, 672) 0 ['block5c_se_squeeze[0][0]']
block5c_se_reduce (Conv2D) (None, 1, 1, 28) 18844 ['block5c_se_reshape[0][0]']
block5c_se_expand (Conv2D) (None, 1, 1, 672) 19488 ['block5c_se_reduce[0][0]']
block5c_se_excite (Multiply) (None, 14, 14, 672) 0 ['block5c_activation[0][0]',
'block5c_se_expand[0][0]']
block5c_project_conv (Conv2D) (None, 14, 14, 112) 75264 ['block5c_se_excite[0][0]']
block5c_project_bn (BatchNorma (None, 14, 14, 112) 448 ['block5c_project_conv[0][0]']
lization)
block5c_drop (Dropout) (None, 14, 14, 112) 0 ['block5c_project_bn[0][0]']
block5c_add (Add) (None, 14, 14, 112) 0 ['block5c_drop[0][0]',
'block5b_add[0][0]']
block5d_expand_conv (Conv2D) (None, 14, 14, 672) 75264 ['block5c_add[0][0]']
block5d_expand_bn (BatchNormal (None, 14, 14, 672) 2688 ['block5d_expand_conv[0][0]']
ization)
block5d_expand_activation (Act (None, 14, 14, 672) 0 ['block5d_expand_bn[0][0]']
ivation)
block5d_dwconv2 (DepthwiseConv (None, 14, 14, 672) 6048 ['block5d_expand_activation[0][0]
2D) ']
block5d_bn (BatchNormalization (None, 14, 14, 672) 2688 ['block5d_dwconv2[0][0]']
)
block5d_activation (Activation (None, 14, 14, 672) 0 ['block5d_bn[0][0]']
)
block5d_se_squeeze (GlobalAver (None, 672) 0 ['block5d_activation[0][0]']
agePooling2D)
block5d_se_reshape (Reshape) (None, 1, 1, 672) 0 ['block5d_se_squeeze[0][0]']
block5d_se_reduce (Conv2D) (None, 1, 1, 28) 18844 ['block5d_se_reshape[0][0]']
block5d_se_expand (Conv2D) (None, 1, 1, 672) 19488 ['block5d_se_reduce[0][0]']
block5d_se_excite (Multiply) (None, 14, 14, 672) 0 ['block5d_activation[0][0]',
'block5d_se_expand[0][0]']
block5d_project_conv (Conv2D) (None, 14, 14, 112) 75264 ['block5d_se_excite[0][0]']
block5d_project_bn (BatchNorma (None, 14, 14, 112) 448 ['block5d_project_conv[0][0]']
lization)
block5d_drop (Dropout) (None, 14, 14, 112) 0 ['block5d_project_bn[0][0]']
block5d_add (Add) (None, 14, 14, 112) 0 ['block5d_drop[0][0]',
'block5c_add[0][0]']
block5e_expand_conv (Conv2D) (None, 14, 14, 672) 75264 ['block5d_add[0][0]']
block5e_expand_bn (BatchNormal (None, 14, 14, 672) 2688 ['block5e_expand_conv[0][0]']
ization)
block5e_expand_activation (Act (None, 14, 14, 672) 0 ['block5e_expand_bn[0][0]']
ivation)
block5e_dwconv2 (DepthwiseConv (None, 14, 14, 672) 6048 ['block5e_expand_activation[0][0]
2D) ']
block5e_bn (BatchNormalization (None, 14, 14, 672) 2688 ['block5e_dwconv2[0][0]']
)
block5e_activation (Activation (None, 14, 14, 672) 0 ['block5e_bn[0][0]']
)
block5e_se_squeeze (GlobalAver (None, 672) 0 ['block5e_activation[0][0]']
agePooling2D)
block5e_se_reshape (Reshape) (None, 1, 1, 672) 0 ['block5e_se_squeeze[0][0]']
block5e_se_reduce (Conv2D) (None, 1, 1, 28) 18844 ['block5e_se_reshape[0][0]']
block5e_se_expand (Conv2D) (None, 1, 1, 672) 19488 ['block5e_se_reduce[0][0]']
block5e_se_excite (Multiply) (None, 14, 14, 672) 0 ['block5e_activation[0][0]',
'block5e_se_expand[0][0]']
block5e_project_conv (Conv2D) (None, 14, 14, 112) 75264 ['block5e_se_excite[0][0]']
block5e_project_bn (BatchNorma (None, 14, 14, 112) 448 ['block5e_project_conv[0][0]']
lization)
block5e_drop (Dropout) (None, 14, 14, 112) 0 ['block5e_project_bn[0][0]']
block5e_add (Add) (None, 14, 14, 112) 0 ['block5e_drop[0][0]',
'block5d_add[0][0]']
block6a_expand_conv (Conv2D) (None, 14, 14, 672) 75264 ['block5e_add[0][0]']
block6a_expand_bn (BatchNormal (None, 14, 14, 672) 2688 ['block6a_expand_conv[0][0]']
ization)
block6a_expand_activation (Act (None, 14, 14, 672) 0 ['block6a_expand_bn[0][0]']
ivation)
block6a_dwconv2 (DepthwiseConv (None, 7, 7, 672) 6048 ['block6a_expand_activation[0][0]
2D) ']
block6a_bn (BatchNormalization (None, 7, 7, 672) 2688 ['block6a_dwconv2[0][0]']
)
block6a_activation (Activation (None, 7, 7, 672) 0 ['block6a_bn[0][0]']
)
block6a_se_squeeze (GlobalAver (None, 672) 0 ['block6a_activation[0][0]']
agePooling2D)
block6a_se_reshape (Reshape) (None, 1, 1, 672) 0 ['block6a_se_squeeze[0][0]']
block6a_se_reduce (Conv2D) (None, 1, 1, 28) 18844 ['block6a_se_reshape[0][0]']
block6a_se_expand (Conv2D) (None, 1, 1, 672) 19488 ['block6a_se_reduce[0][0]']
block6a_se_excite (Multiply) (None, 7, 7, 672) 0 ['block6a_activation[0][0]',
'block6a_se_expand[0][0]']
block6a_project_conv (Conv2D) (None, 7, 7, 192) 129024 ['block6a_se_excite[0][0]']
block6a_project_bn (BatchNorma (None, 7, 7, 192) 768 ['block6a_project_conv[0][0]']
lization)
block6b_expand_conv (Conv2D) (None, 7, 7, 1152) 221184 ['block6a_project_bn[0][0]']
block6b_expand_bn (BatchNormal (None, 7, 7, 1152) 4608 ['block6b_expand_conv[0][0]']
ization)
block6b_expand_activation (Act (None, 7, 7, 1152) 0 ['block6b_expand_bn[0][0]']
ivation)
block6b_dwconv2 (DepthwiseConv (None, 7, 7, 1152) 10368 ['block6b_expand_activation[0][0]
2D) ']
block6b_bn (BatchNormalization (None, 7, 7, 1152) 4608 ['block6b_dwconv2[0][0]']
)
block6b_activation (Activation (None, 7, 7, 1152) 0 ['block6b_bn[0][0]']
)
block6b_se_squeeze (GlobalAver (None, 1152) 0 ['block6b_activation[0][0]']
agePooling2D)
block6b_se_reshape (Reshape) (None, 1, 1, 1152) 0 ['block6b_se_squeeze[0][0]']
block6b_se_reduce (Conv2D) (None, 1, 1, 48) 55344 ['block6b_se_reshape[0][0]']
block6b_se_expand (Conv2D) (None, 1, 1, 1152) 56448 ['block6b_se_reduce[0][0]']
block6b_se_excite (Multiply) (None, 7, 7, 1152) 0 ['block6b_activation[0][0]',
'block6b_se_expand[0][0]']
block6b_project_conv (Conv2D) (None, 7, 7, 192) 221184 ['block6b_se_excite[0][0]']
block6b_project_bn (BatchNorma (None, 7, 7, 192) 768 ['block6b_project_conv[0][0]']
lization)
block6b_drop (Dropout) (None, 7, 7, 192) 0 ['block6b_project_bn[0][0]']
block6b_add (Add) (None, 7, 7, 192) 0 ['block6b_drop[0][0]',
'block6a_project_bn[0][0]']
block6c_expand_conv (Conv2D) (None, 7, 7, 1152) 221184 ['block6b_add[0][0]']
block6c_expand_bn (BatchNormal (None, 7, 7, 1152) 4608 ['block6c_expand_conv[0][0]']
ization)
block6c_expand_activation (Act (None, 7, 7, 1152) 0 ['block6c_expand_bn[0][0]']
ivation)
block6c_dwconv2 (DepthwiseConv (None, 7, 7, 1152) 10368 ['block6c_expand_activation[0][0]
2D) ']
block6c_bn (BatchNormalization (None, 7, 7, 1152) 4608 ['block6c_dwconv2[0][0]']
)
block6c_activation (Activation (None, 7, 7, 1152) 0 ['block6c_bn[0][0]']
)
block6c_se_squeeze (GlobalAver (None, 1152) 0 ['block6c_activation[0][0]']
agePooling2D)
block6c_se_reshape (Reshape) (None, 1, 1, 1152) 0 ['block6c_se_squeeze[0][0]']
block6c_se_reduce (Conv2D) (None, 1, 1, 48) 55344 ['block6c_se_reshape[0][0]']
block6c_se_expand (Conv2D) (None, 1, 1, 1152) 56448 ['block6c_se_reduce[0][0]']
block6c_se_excite (Multiply) (None, 7, 7, 1152) 0 ['block6c_activation[0][0]',
'block6c_se_expand[0][0]']
block6c_project_conv (Conv2D) (None, 7, 7, 192) 221184 ['block6c_se_excite[0][0]']
block6c_project_bn (BatchNorma (None, 7, 7, 192) 768 ['block6c_project_conv[0][0]']
lization)
block6c_drop (Dropout) (None, 7, 7, 192) 0 ['block6c_project_bn[0][0]']
block6c_add (Add) (None, 7, 7, 192) 0 ['block6c_drop[0][0]',
'block6b_add[0][0]']
block6d_expand_conv (Conv2D) (None, 7, 7, 1152) 221184 ['block6c_add[0][0]']
block6d_expand_bn (BatchNormal (None, 7, 7, 1152) 4608 ['block6d_expand_conv[0][0]']
ization)
block6d_expand_activation (Act (None, 7, 7, 1152) 0 ['block6d_expand_bn[0][0]']
ivation)
block6d_dwconv2 (DepthwiseConv (None, 7, 7, 1152) 10368 ['block6d_expand_activation[0][0]
2D) ']
block6d_bn (BatchNormalization (None, 7, 7, 1152) 4608 ['block6d_dwconv2[0][0]']
)
block6d_activation (Activation (None, 7, 7, 1152) 0 ['block6d_bn[0][0]']
)
block6d_se_squeeze (GlobalAver (None, 1152) 0 ['block6d_activation[0][0]']
agePooling2D)
block6d_se_reshape (Reshape) (None, 1, 1, 1152) 0 ['block6d_se_squeeze[0][0]']
block6d_se_reduce (Conv2D) (None, 1, 1, 48) 55344 ['block6d_se_reshape[0][0]']
block6d_se_expand (Conv2D) (None, 1, 1, 1152) 56448 ['block6d_se_reduce[0][0]']
block6d_se_excite (Multiply) (None, 7, 7, 1152) 0 ['block6d_activation[0][0]',
'block6d_se_expand[0][0]']
block6d_project_conv (Conv2D) (None, 7, 7, 192) 221184 ['block6d_se_excite[0][0]']
block6d_project_bn (BatchNorma (None, 7, 7, 192) 768 ['block6d_project_conv[0][0]']
lization)
block6d_drop (Dropout) (None, 7, 7, 192) 0 ['block6d_project_bn[0][0]']
block6d_add (Add) (None, 7, 7, 192) 0 ['block6d_drop[0][0]',
'block6c_add[0][0]']
block6e_expand_conv (Conv2D) (None, 7, 7, 1152) 221184 ['block6d_add[0][0]']
block6e_expand_bn (BatchNormal (None, 7, 7, 1152) 4608 ['block6e_expand_conv[0][0]']
ization)
block6e_expand_activation (Act (None, 7, 7, 1152) 0 ['block6e_expand_bn[0][0]']
ivation)
block6e_dwconv2 (DepthwiseConv (None, 7, 7, 1152) 10368 ['block6e_expand_activation[0][0]
2D) ']
block6e_bn (BatchNormalization (None, 7, 7, 1152) 4608 ['block6e_dwconv2[0][0]']
)
block6e_activation (Activation (None, 7, 7, 1152) 0 ['block6e_bn[0][0]']
)
block6e_se_squeeze (GlobalAver (None, 1152) 0 ['block6e_activation[0][0]']
agePooling2D)
block6e_se_reshape (Reshape) (None, 1, 1, 1152) 0 ['block6e_se_squeeze[0][0]']
block6e_se_reduce (Conv2D) (None, 1, 1, 48) 55344 ['block6e_se_reshape[0][0]']
block6e_se_expand (Conv2D) (None, 1, 1, 1152) 56448 ['block6e_se_reduce[0][0]']
block6e_se_excite (Multiply) (None, 7, 7, 1152) 0 ['block6e_activation[0][0]',
'block6e_se_expand[0][0]']
block6e_project_conv (Conv2D) (None, 7, 7, 192) 221184 ['block6e_se_excite[0][0]']
block6e_project_bn (BatchNorma (None, 7, 7, 192) 768 ['block6e_project_conv[0][0]']
lization)
block6e_drop (Dropout) (None, 7, 7, 192) 0 ['block6e_project_bn[0][0]']
block6e_add (Add) (None, 7, 7, 192) 0 ['block6e_drop[0][0]',
'block6d_add[0][0]']
block6f_expand_conv (Conv2D) (None, 7, 7, 1152) 221184 ['block6e_add[0][0]']
block6f_expand_bn (BatchNormal (None, 7, 7, 1152) 4608 ['block6f_expand_conv[0][0]']
ization)
block6f_expand_activation (Act (None, 7, 7, 1152) 0 ['block6f_expand_bn[0][0]']
ivation)
block6f_dwconv2 (DepthwiseConv (None, 7, 7, 1152) 10368 ['block6f_expand_activation[0][0]
2D) ']
block6f_bn (BatchNormalization (None, 7, 7, 1152) 4608 ['block6f_dwconv2[0][0]']
)
block6f_activation (Activation (None, 7, 7, 1152) 0 ['block6f_bn[0][0]']
)
block6f_se_squeeze (GlobalAver (None, 1152) 0 ['block6f_activation[0][0]']
agePooling2D)
block6f_se_reshape (Reshape) (None, 1, 1, 1152) 0 ['block6f_se_squeeze[0][0]']
block6f_se_reduce (Conv2D) (None, 1, 1, 48) 55344 ['block6f_se_reshape[0][0]']
block6f_se_expand (Conv2D) (None, 1, 1, 1152) 56448 ['block6f_se_reduce[0][0]']
block6f_se_excite (Multiply) (None, 7, 7, 1152) 0 ['block6f_activation[0][0]',
'block6f_se_expand[0][0]']
block6f_project_conv (Conv2D) (None, 7, 7, 192) 221184 ['block6f_se_excite[0][0]']
block6f_project_bn (BatchNorma (None, 7, 7, 192) 768 ['block6f_project_conv[0][0]']
lization)
block6f_drop (Dropout) (None, 7, 7, 192) 0 ['block6f_project_bn[0][0]']
block6f_add (Add) (None, 7, 7, 192) 0 ['block6f_drop[0][0]',
'block6e_add[0][0]']
block6g_expand_conv (Conv2D) (None, 7, 7, 1152) 221184 ['block6f_add[0][0]']
block6g_expand_bn (BatchNormal (None, 7, 7, 1152) 4608 ['block6g_expand_conv[0][0]']
ization)
block6g_expand_activation (Act (None, 7, 7, 1152) 0 ['block6g_expand_bn[0][0]']
ivation)
block6g_dwconv2 (DepthwiseConv (None, 7, 7, 1152) 10368 ['block6g_expand_activation[0][0]
2D) ']
block6g_bn (BatchNormalization (None, 7, 7, 1152) 4608 ['block6g_dwconv2[0][0]']
)
block6g_activation (Activation (None, 7, 7, 1152) 0 ['block6g_bn[0][0]']
)
block6g_se_squeeze (GlobalAver (None, 1152) 0 ['block6g_activation[0][0]']
agePooling2D)
block6g_se_reshape (Reshape) (None, 1, 1, 1152) 0 ['block6g_se_squeeze[0][0]']
block6g_se_reduce (Conv2D) (None, 1, 1, 48) 55344 ['block6g_se_reshape[0][0]']
block6g_se_expand (Conv2D) (None, 1, 1, 1152) 56448 ['block6g_se_reduce[0][0]']
block6g_se_excite (Multiply) (None, 7, 7, 1152) 0 ['block6g_activation[0][0]',
'block6g_se_expand[0][0]']
block6g_project_conv (Conv2D) (None, 7, 7, 192) 221184 ['block6g_se_excite[0][0]']
block6g_project_bn (BatchNorma (None, 7, 7, 192) 768 ['block6g_project_conv[0][0]']
lization)
block6g_drop (Dropout) (None, 7, 7, 192) 0 ['block6g_project_bn[0][0]']
block6g_add (Add) (None, 7, 7, 192) 0 ['block6g_drop[0][0]',
'block6f_add[0][0]']
block6h_expand_conv (Conv2D) (None, 7, 7, 1152) 221184 ['block6g_add[0][0]']
block6h_expand_bn (BatchNormal (None, 7, 7, 1152) 4608 ['block6h_expand_conv[0][0]']
ization)
block6h_expand_activation (Act (None, 7, 7, 1152) 0 ['block6h_expand_bn[0][0]']
ivation)
block6h_dwconv2 (DepthwiseConv (None, 7, 7, 1152) 10368 ['block6h_expand_activation[0][0]
2D) ']
block6h_bn (BatchNormalization (None, 7, 7, 1152) 4608 ['block6h_dwconv2[0][0]']
)
block6h_activation (Activation (None, 7, 7, 1152) 0 ['block6h_bn[0][0]']
)
block6h_se_squeeze (GlobalAver (None, 1152) 0 ['block6h_activation[0][0]']
agePooling2D)
block6h_se_reshape (Reshape) (None, 1, 1, 1152) 0 ['block6h_se_squeeze[0][0]']
block6h_se_reduce (Conv2D) (None, 1, 1, 48) 55344 ['block6h_se_reshape[0][0]']
block6h_se_expand (Conv2D) (None, 1, 1, 1152) 56448 ['block6h_se_reduce[0][0]']
block6h_se_excite (Multiply) (None, 7, 7, 1152) 0 ['block6h_activation[0][0]',
'block6h_se_expand[0][0]']
block6h_project_conv (Conv2D) (None, 7, 7, 192) 221184 ['block6h_se_excite[0][0]']
block6h_project_bn (BatchNorma (None, 7, 7, 192) 768 ['block6h_project_conv[0][0]']
lization)
block6h_drop (Dropout) (None, 7, 7, 192) 0 ['block6h_project_bn[0][0]']
block6h_add (Add) (None, 7, 7, 192) 0 ['block6h_drop[0][0]',
'block6g_add[0][0]']
top_conv (Conv2D) (None, 7, 7, 1280) 245760 ['block6h_add[0][0]']
top_bn (BatchNormalization) (None, 7, 7, 1280) 5120 ['top_conv[0][0]']
top_activation (Activation) (None, 7, 7, 1280) 0 ['top_bn[0][0]']
Flatten_for_hidden_layers (Glo (None, 1280) 0 ['top_activation[0][0]']
balAveragePooling2D)
Dropout1 (Dropout) (None, 1280) 0 ['Flatten_for_hidden_layers[0][0]
']
Hidden_Layer1 (Dense) (None, 128) 163968 ['Dropout1[0][0]']
Dropout2 (Dropout) (None, 128) 0 ['Hidden_Layer1[0][0]']
Hidden_Layer2 (Dense) (None, 128) 16512 ['Dropout2[0][0]']
Dropout3 (Dropout) (None, 128) 0 ['Hidden_Layer2[0][0]']
Hidden_Layer3 (Dense) (None, 128) 16512 ['Dropout3[0][0]']
output (Dense) (None, 4) 516 ['Hidden_Layer3[0][0]']
==================================================================================================
Total params: 6,116,820
Trainable params: 6,056,212
Non-trainable params: 60,608
__________________________________________________________________________________________________
# plotting the model
plot_model(eff_net_v2_b0, to_file='eff_net_v2_b0.png', show_shapes=True, show_layer_names=True)
tf.autograph.experimental.do_not_convert(func=None)
<function tensorflow.python.autograph.impl.api.do_not_convert(func=None)>
# Size of TRAIN & VALIDATION labels and BATCH SIZE
y_train.shape[0], BATCH_SIZE, y_val.shape[0]
(11138, 32, 1966)
# Calculating train steps
train_steps = y_train.shape[0] // BATCH_SIZE
train_steps
348
# Calculating test steps
valid_steps = y_val.shape[0] // BATCH_SIZE
valid_steps
61
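The two floor divisions above reduce to simple arithmetic on the shapes printed earlier; a quick check (sizes taken from `y_train.shape[0]` and `y_val.shape[0]` above):

```python
# Steps per epoch: integer ratio of sample count to batch size.
# The trailing partial batch is dropped by the floor division.
BATCH_SIZE = 32
train_steps = 11138 // BATCH_SIZE  # 348 batches per training epoch
valid_steps = 1966 // BATCH_SIZE   # 61 validation batches
```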
cw1_dict
{0: 0.8862189688096753,
1: 4.954626334519573,
2: 0.7281642259414226,
3: 0.7713296398891967}
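The `cw1_dict` weights above match scikit-learn's `'balanced'` heuristic, `n_samples / (n_classes * count_c)`. A sketch that reproduces them; note the per-class counts (3142 / 562 / 3824 / 3610) are back-calculated from the weights and the 11138 training samples, so treat them as inferred rather than quoted from the notebook:

```python
import numpy as np
from sklearn.utils import class_weight

# Integer labels standing in for the argmax of the one-hot y_train;
# class counts are inferred from cw1_dict, not read from the dataset.
labels = np.array([0] * 3142 + [1] * 562 + [2] * 3824 + [3] * 3610)

weights = class_weight.compute_class_weight(
    class_weight='balanced', classes=np.unique(labels), y=labels)
cw1_dict = dict(enumerate(weights))
# The rare Multiple_Diseases class (index 1) gets the largest weight, ~4.95.
```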
# Training the EffNet V2 B0 model having custom top
history5 = eff_net_v2_b0.fit(X_train, y_train,
epochs=18,
batch_size=BATCH_SIZE,
callbacks=[tensorboard_callback5, reduce_lr5],
steps_per_epoch=train_steps,
validation_steps=valid_steps,
validation_data=[X_val, y_val],
class_weight=cw1_dict,
verbose=1)
Epoch 1/18
348/348 [==============================] - 90s 200ms/step - loss: 1.3910 - categorical_accuracy: 0.3504 - f1_score: 0.3110 - val_loss: 1.0243 - val_categorical_accuracy: 0.7341 - val_f1_score: 0.5721 - lr: 1.0000e-04
Epoch 2/18
348/348 [==============================] - 68s 192ms/step - loss: 0.8989 - categorical_accuracy: 0.6900 - f1_score: 0.6072 - val_loss: 0.7749 - val_categorical_accuracy: 0.7669 - val_f1_score: 0.5907 - lr: 1.0000e-04
Epoch 3/18
348/348 [==============================] - 67s 191ms/step - loss: 0.5490 - categorical_accuracy: 0.8330 - f1_score: 0.7442 - val_loss: 0.4867 - val_categorical_accuracy: 0.8422 - val_f1_score: 0.7582 - lr: 1.0000e-04
Epoch 4/18
348/348 [==============================] - 67s 191ms/step - loss: 0.3328 - categorical_accuracy: 0.8987 - f1_score: 0.8310 - val_loss: 0.4060 - val_categorical_accuracy: 0.8601 - val_f1_score: 0.7737 - lr: 1.0000e-04
Epoch 5/18
348/348 [==============================] - 67s 191ms/step - loss: 0.2213 - categorical_accuracy: 0.9326 - f1_score: 0.8804 - val_loss: 0.1334 - val_categorical_accuracy: 0.9606 - val_f1_score: 0.9212 - lr: 1.0000e-04
Epoch 6/18
348/348 [==============================] - 67s 192ms/step - loss: 0.1404 - categorical_accuracy: 0.9562 - f1_score: 0.9200 - val_loss: 0.2580 - val_categorical_accuracy: 0.9303 - val_f1_score: 0.8171 - lr: 1.0000e-04
Epoch 7/18
Epoch 7: ReduceLROnPlateau reducing learning rate to 9.999999747378752e-06.
348/348 [==============================] - 67s 191ms/step - loss: 0.1120 - categorical_accuracy: 0.9678 - f1_score: 0.9417 - val_loss: 0.1397 - val_categorical_accuracy: 0.9534 - val_f1_score: 0.8870 - lr: 1.0000e-04
Epoch 8/18
348/348 [==============================] - 67s 191ms/step - loss: 0.0740 - categorical_accuracy: 0.9788 - f1_score: 0.9625 - val_loss: 0.0274 - val_categorical_accuracy: 0.9913 - val_f1_score: 0.9789 - lr: 1.0000e-05
Epoch 9/18
348/348 [==============================] - 67s 192ms/step - loss: 0.0639 - categorical_accuracy: 0.9817 - f1_score: 0.9663 - val_loss: 0.0258 - val_categorical_accuracy: 0.9918 - val_f1_score: 0.9816 - lr: 1.0000e-05
Epoch 10/18
348/348 [==============================] - 67s 191ms/step - loss: 0.0539 - categorical_accuracy: 0.9838 - f1_score: 0.9685 - val_loss: 0.0251 - val_categorical_accuracy: 0.9939 - val_f1_score: 0.9864 - lr: 1.0000e-05
Epoch 11/18
348/348 [==============================] - 67s 191ms/step - loss: 0.0604 - categorical_accuracy: 0.9826 - f1_score: 0.9684 - val_loss: 0.0312 - val_categorical_accuracy: 0.9903 - val_f1_score: 0.9777 - lr: 1.0000e-05
Epoch 12/18
Epoch 12: ReduceLROnPlateau reducing learning rate to 9.999999747378752e-07.
348/348 [==============================] - 67s 192ms/step - loss: 0.0451 - categorical_accuracy: 0.9864 - f1_score: 0.9751 - val_loss: 0.0319 - val_categorical_accuracy: 0.9892 - val_f1_score: 0.9748 - lr: 1.0000e-05
Epoch 13/18
348/348 [==============================] - 67s 192ms/step - loss: 0.0470 - categorical_accuracy: 0.9858 - f1_score: 0.9738 - val_loss: 0.0225 - val_categorical_accuracy: 0.9933 - val_f1_score: 0.9849 - lr: 1.0000e-06
Epoch 14/18
348/348 [==============================] - 67s 191ms/step - loss: 0.0425 - categorical_accuracy: 0.9881 - f1_score: 0.9781 - val_loss: 0.0184 - val_categorical_accuracy: 0.9939 - val_f1_score: 0.9863 - lr: 1.0000e-06
Epoch 15/18
348/348 [==============================] - 67s 191ms/step - loss: 0.0468 - categorical_accuracy: 0.9874 - f1_score: 0.9772 - val_loss: 0.0214 - val_categorical_accuracy: 0.9933 - val_f1_score: 0.9860 - lr: 1.0000e-06
Epoch 16/18
Epoch 16: ReduceLROnPlateau reducing learning rate to 9.999999974752428e-08.
348/348 [==============================] - 67s 192ms/step - loss: 0.0382 - categorical_accuracy: 0.9892 - f1_score: 0.9834 - val_loss: 0.0236 - val_categorical_accuracy: 0.9933 - val_f1_score: 0.9849 - lr: 1.0000e-06
Epoch 17/18
348/348 [==============================] - 67s 192ms/step - loss: 0.0447 - categorical_accuracy: 0.9865 - f1_score: 0.9755 - val_loss: 0.0232 - val_categorical_accuracy: 0.9939 - val_f1_score: 0.9864 - lr: 1.0000e-07
Epoch 18/18
Epoch 18: ReduceLROnPlateau reducing learning rate to 1.0000000116860975e-08.
348/348 [==============================] - 67s 192ms/step - loss: 0.0424 - categorical_accuracy: 0.9870 - f1_score: 0.9796 - val_loss: 0.0213 - val_categorical_accuracy: 0.9939 - val_f1_score: 0.9864 - lr: 1.0000e-07
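The `reduce_lr5` and `tensorboard_callback5` callbacks referenced in the `fit()` call are defined outside this section. A plausible reconstruction consistent with the log (the learning rate drops 10x at epochs 7, 12, 16 and 18) might look like the following; the `monitor`, `factor` and `patience` values here are assumptions, not the notebook's actual arguments:

```python
import tensorflow as tf

# Assumed reconstruction of the training callbacks used above; exact arguments may differ.
reduce_lr5 = tf.keras.callbacks.ReduceLROnPlateau(
    monitor='val_loss',  # assumption: plateau detected on validation loss
    factor=0.1,          # matches the 1e-4 -> 1e-5 -> 1e-6 -> 1e-7 drops in the log
    patience=2,
    verbose=1)

tensorboard_callback5 = tf.keras.callbacks.TensorBoard(
    log_dir='logs/run_2022_11_07-04_12_40')  # run name as printed by curr_run_logdir5
```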
# Actual TGT classes distribution in validation set
# print("\n:::: Validation Set ====> ACTUAL TGT Classes Distribution ::::\n")
# display(val_tgt_classes_dist)
# Plotting the Final Results on Validation Set
print("\n:::: Validation Set ====> PREDICTION Confusion Matrix ::::\n")
eff_net_v2_b0_global_tuning_val_results = confusion_matrix_(y_val, X_val, eff_net_v2_b0)
# Displaying the overall performance results
print("\n:::: Validation Set ====> FINAL Results ::::\n")
display(eff_net_v2_b0_global_tuning_val_results)
:::: Validation Set ====> PREDICTION Confusion Matrix ::::

62/62 [==============================] - 5s 56ms/step
:::: Validation Set ====> FINAL Results ::::
| | Healthy | Multiple_Diseases | Rust | Scab |
|---|---|---|---|---|
| BINARY Accuracy | 0.9980 | 0.9969 | 0.9976 | 0.9969 |
| Precision | 0.9948 | 0.9881 | 0.9932 | 0.9939 |
| Recall | 0.9983 | 0.9940 | 0.9962 | 0.9939 |
| Macro F1 Score | 0.9975 | 0.9781 | 0.9988 | 0.9942 |
| Macro ROC AUC Score | 0.9981 | 0.9829 | 0.9988 | 0.9935 |
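The table reports one-vs-rest ("binary") metrics per class. The notebook's `confusion_matrix_` helper is not shown in this section, but the per-class numbers can be computed along these lines; the function and names here are illustrative, not the notebook's actual implementation:

```python
import numpy as np
from sklearn.metrics import precision_score, recall_score

def per_class_metrics(y_true, y_pred, class_idx):
    """One-vs-rest metrics for one class, given integer label arrays."""
    t = (y_true == class_idx).astype(int)  # 1 where the true label is this class
    p = (y_pred == class_idx).astype(int)  # 1 where the prediction is this class
    return {
        'binary_accuracy': float(np.mean(t == p)),
        'precision': precision_score(t, p, zero_division=0),
        'recall': recall_score(t, p, zero_division=0),
    }
```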
OBSERVATIONS
# Actual TGT classes distribution in TEST set
# print("\n:::: TEST Set ====> ACTUAL TGT Classes Distribution ::::\n")
# display(val_tgt_classes_dist)
# Plotting the Results on TEST Set
print("\n:::: TEST Set ====> PREDICTION Confusion Matrix ::::\n")
eff_net_v2_b0_global_tuning_test_results = confusion_matrix_(y_test, X_test, eff_net_v2_b0)
# Displaying the overall performance results
print("\n:::: TEST Set ====> FINAL Results ::::\n")
display(eff_net_v2_b0_global_tuning_test_results)
:::: TEST Set ====> PREDICTION Confusion Matrix ::::

12/12 [==============================] - 1s 89ms/step
:::: TEST Set ====> FINAL Results ::::
| | Healthy | Multiple_Diseases | Rust | Scab |
|---|---|---|---|---|
| BINARY Accuracy | 0.7890 | 0.8466 | 0.8849 | 0.8699 |
| Precision | 0.5783 | 0.5236 | 0.6961 | 0.7397 |
| Recall | 0.9320 | 0.8264 | 0.8659 | 0.7397 |
| Macro F1 Score | 0.7734 | 0.5676 | 0.9565 | 0.7623 |
| Macro ROC AUC Score | 0.8324 | 0.5809 | 0.9478 | 0.7354 |
OBSERVATIONS
The model identifies Rust, Scab and Healthy images well. For the Multiple_Diseases class, which has the fewest positive cases, it produces some false positives and false negatives, so it is not yet competent at identifying multiple-diseased images.
curr_run_logdir5.split("/")[-1]
'run_2022_11_07-04_12_40'
notebook.list()
No known TensorBoard instances running.
%tensorboard --logdir logs

OBSERVATIONS
OVERALL_RESULTS¶
SUMMARY¶DenseNet-121 & MobileNet V3 Small have performed well compared to ResNet-50 & the other models.